started mongo stuff, but it's a PITA. fuck mongo

This commit is contained in:
Pablo Martin 2025-06-02 17:24:23 +02:00
parent 4cd36ea3fc
commit 0f1bc5dff3
937 changed files with 205043 additions and 0 deletions

@@ -0,0 +1,22 @@
# MIT License
Copyright (c) 2010-2013 LearnBoost <dev@learnboost.com>
Copyright (c) 2013-2021 Automattic
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

@@ -0,0 +1,375 @@
# Mongoose
Mongoose is a [MongoDB](https://www.mongodb.org/) object modeling tool designed to work in an asynchronous environment. Mongoose supports [Node.js](https://nodejs.org/en/) and [Deno](https://deno.land/) (alpha).
[![Build Status](https://github.com/Automattic/mongoose/workflows/Test/badge.svg)](https://github.com/Automattic/mongoose)
[![NPM version](https://badge.fury.io/js/mongoose.svg)](http://badge.fury.io/js/mongoose)
[![Deno version](https://deno.land/badge/mongoose/version)](https://deno.land/x/mongoose)
[![Deno popularity](https://deno.land/badge/mongoose/popularity)](https://deno.land/x/mongoose)
[![npm](https://nodei.co/npm/mongoose.png)](https://www.npmjs.com/package/mongoose)
## Documentation
The official documentation website is [mongoosejs.com](http://mongoosejs.com/).
Mongoose 8.0.0 was released on October 31, 2023. You can find more details on [backwards breaking changes in 8.0.0 on our docs site](https://mongoosejs.com/docs/migrating_to_8.html).
## Support
* [Stack Overflow](http://stackoverflow.com/questions/tagged/mongoose)
* [Bug Reports](https://github.com/Automattic/mongoose/issues/)
* [Mongoose Slack Channel](http://slack.mongoosejs.io/)
* [Help Forum](http://groups.google.com/group/mongoose-orm)
* [MongoDB Support](https://www.mongodb.com/docs/manual/support/)
## Plugins
Check out the [plugins search site](http://plugins.mongoosejs.io/) to see hundreds of related modules from the community. Next, learn how to write your own plugin from the [docs](http://mongoosejs.com/docs/plugins.html) or [this blog post](http://thecodebarbarian.com/2015/03/06/guide-to-mongoose-plugins).
## Contributors
Pull requests are always welcome! Please base pull requests against the `master`
branch and follow the [contributing guide](https://github.com/Automattic/mongoose/blob/master/CONTRIBUTING.md).
If your pull request makes documentation changes, please do **not**
modify any `.html` files. The `.html` files are compiled code, so please make
your changes in `docs/*.pug`, `lib/*.js`, or `test/docs/*.js`.
View all 400+ [contributors](https://github.com/Automattic/mongoose/graphs/contributors).
## Installation
First install [Node.js](http://nodejs.org/) and [MongoDB](https://www.mongodb.org/downloads). Then:
```sh
npm install mongoose
```
Mongoose 6.8.0 also includes alpha support for [Deno](https://deno.land/).
## Importing
```javascript
// Using Node.js `require()`
const mongoose = require('mongoose');
// Using ES6 imports
import mongoose from 'mongoose';
```
Or, using [Deno's `createRequire()` for CommonJS support](https://deno.land/std@0.113.0/node/README.md?source=#commonjs-modules-loading) as follows.
```javascript
import { createRequire } from 'https://deno.land/std@0.177.0/node/module.ts';
const require = createRequire(import.meta.url);
const mongoose = require('mongoose');
mongoose.connect('mongodb://127.0.0.1:27017/test')
.then(() => console.log('Connected!'));
```
You can then run the above script using the following.
```sh
deno run --allow-net --allow-read --allow-sys --allow-env mongoose-test.js
```
## Mongoose for Enterprise
Available as part of the Tidelift Subscription
The maintainers of mongoose and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. [Learn more.](https://tidelift.com/subscription/pkg/npm-mongoose?utm_source=npm-mongoose&utm_medium=referral&utm_campaign=enterprise&utm_term=repo)
## Overview
### Connecting to MongoDB
First, we need to define a connection. If your app uses only one database, you should use `mongoose.connect`. If you need to create additional connections, use `mongoose.createConnection`.
Both `connect` and `createConnection` take a `mongodb://` URI, or the parameters `host, database, port, options`.
```js
await mongoose.connect('mongodb://127.0.0.1/my_database');
```
Once connected, the `open` event is fired on the `Connection` instance. If you're using `mongoose.connect`, the `Connection` is `mongoose.connection`. Otherwise, the return value of `mongoose.createConnection` is a `Connection`.
**Note:** *If the local connection fails then try using 127.0.0.1 instead of localhost. Sometimes issues may arise when the local hostname has been changed.*
**Important!** Mongoose buffers all the commands until it's connected to the database. This means that you don't have to wait until it connects to MongoDB in order to define models, run queries, etc.
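The buffering behavior can be pictured with a small self-contained sketch (illustrative only, not Mongoose's actual internals): commands issued before the connection opens are queued, then flushed as soon as it does.

```javascript
// Minimal sketch of command buffering. This is NOT Mongoose code,
// just an illustration of the queue-until-connected idea.
class BufferingConnection {
  constructor() {
    this.connected = false;
    this.queue = [];
  }
  // Run the operation now if connected, otherwise buffer it.
  run(op) {
    if (this.connected) return op();
    this.queue.push(op);
  }
  connect() {
    this.connected = true;
    // Flush everything issued before the connection opened.
    for (const op of this.queue.splice(0)) op();
  }
}

const conn = new BufferingConnection();
const log = [];
conn.run(() => log.push('find()'));  // buffered, not executed yet
conn.connect();                      // flushes the queue
conn.run(() => log.push('save()'));  // runs immediately
console.log(log); // [ 'find()', 'save()' ]
```

This is why you can define models and issue queries before `mongoose.connect()` resolves: the real driver-level commands are simply held back until the connection is ready.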
### Defining a Model
Models are defined through the `Schema` interface.
```js
const Schema = mongoose.Schema;
const ObjectId = Schema.ObjectId;
const BlogPost = new Schema({
author: ObjectId,
title: String,
body: String,
date: Date
});
```
Aside from defining the structure of your documents and the types of data you're storing, a Schema handles the definition of:
* [Validators](http://mongoosejs.com/docs/validation.html) (async and sync)
* [Defaults](http://mongoosejs.com/docs/api/schematype.html#schematype_SchemaType-default)
* [Getters](http://mongoosejs.com/docs/api/schematype.html#schematype_SchemaType-get)
* [Setters](http://mongoosejs.com/docs/api/schematype.html#schematype_SchemaType-set)
* [Indexes](http://mongoosejs.com/docs/guide.html#indexes)
* [Middleware](http://mongoosejs.com/docs/middleware.html)
* [Methods](http://mongoosejs.com/docs/guide.html#methods) definition
* [Statics](http://mongoosejs.com/docs/guide.html#statics) definition
* [Plugins](http://mongoosejs.com/docs/plugins.html)
* [pseudo-JOINs](http://mongoosejs.com/docs/populate.html)
The following example shows some of these features:
```js
const Comment = new Schema({
name: { type: String, default: 'hahaha' },
age: { type: Number, min: 18, index: true },
bio: { type: String, match: /[a-z]/ },
date: { type: Date, default: Date.now },
buff: Buffer
});
// a setter
Comment.path('name').set(function(v) {
return capitalize(v);
});
// middleware
Comment.pre('save', function(next) {
notify(this.get('email'));
next();
});
```
Take a look at the example in [`examples/schema/schema.js`](https://github.com/Automattic/mongoose/blob/master/examples/schema/schema.js) for an end-to-end example of a typical setup.
### Accessing a Model
Once we define a model through `mongoose.model('ModelName', mySchema)`, we can access it through the same function
```js
const MyModel = mongoose.model('ModelName');
```
Or just do it all at once
```js
const MyModel = mongoose.model('ModelName', mySchema);
```
The first argument is the *singular* name of the collection your model is for. **Mongoose automatically looks for the *plural* version of your model name.** For example, if you use
```js
const MyModel = mongoose.model('Ticket', mySchema);
```
Then `MyModel` will use the **tickets** collection, not the **ticket** collection. For more details read the [model docs](https://mongoosejs.com/docs/api/mongoose.html#mongoose_Mongoose-model).
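The lowercase-and-pluralize mapping can be sketched in plain JavaScript. This toy function is an illustration only; Mongoose's real pluralizer handles many irregular English forms this sketch does not.

```javascript
// Naive illustration of the model-name -> collection-name mapping.
// NOT Mongoose's actual pluralizer, which covers irregular nouns.
function toCollectionName(modelName) {
  const lower = modelName.toLowerCase();
  if (/(s|x|z|ch|sh)$/.test(lower)) return lower + 'es';            // 'Box' -> 'boxes'
  if (/[^aeiou]y$/.test(lower)) return lower.slice(0, -1) + 'ies';  // 'Company' -> 'companies'
  return lower + 's';                                               // 'Ticket' -> 'tickets'
}

console.log(toCollectionName('Ticket'));  // 'tickets'
console.log(toCollectionName('Company')); // 'companies'
```

If the default collection name is not what you want, `mongoose.model()` also accepts an explicit collection name as a third argument.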
Once we have our model, we can then instantiate it, and save it:
```js
const instance = new MyModel();
instance.my.key = 'hello';
await instance.save();
```
Or we can find documents from the same collection
```js
await MyModel.find({});
```
You can also `findOne`, `findById`, `update`, etc.
```js
const instance = await MyModel.findOne({ /* ... */ });
console.log(instance.my.key); // 'hello'
```
For more details check out [the docs](http://mongoosejs.com/docs/queries.html).
**Important!** If you opened a separate connection using `mongoose.createConnection()` but attempt to access the model through `mongoose.model('ModelName')` it will not work as expected since it is not hooked up to an active db connection. In this case access your model through the connection you created:
```js
const conn = mongoose.createConnection('your connection string');
const MyModel = conn.model('ModelName', schema);
const m = new MyModel();
await m.save(); // works
```
vs
```js
const conn = mongoose.createConnection('your connection string');
const MyModel = mongoose.model('ModelName', schema);
const m = new MyModel();
await m.save(); // does not work b/c the default connection object was never connected
```
### Embedded Documents
Schemas can be embedded inside other schemas. Suppose the `BlogPost` schema above also defined a key that looks like:
```txt
comments: [Comment]
```
Where `Comment` is a `Schema` we created. This means that creating embedded documents is as simple as:
```js
// retrieve my model
const BlogPost = mongoose.model('BlogPost');
// create a blog post
const post = new BlogPost();
// create a comment
post.comments.push({ title: 'My comment' });
await post.save();
```
The same goes for removing them:
```js
const post = await BlogPost.findById(myId);
post.comments[0].deleteOne();
await post.save();
```
Embedded documents enjoy all the same features as your models: defaults, validators, middleware.
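That cascade can be pictured with a minimal sketch (plain JavaScript, not Mongoose's implementation): validating a parent document recursively validates each embedded child against the child schema.

```javascript
// Illustrative sketch: parent validation cascades into embedded documents,
// the way Mongoose runs subdocument validators. NOT real Mongoose code.
function validate(doc, schema) {
  for (const [key, rule] of Object.entries(schema)) {
    const value = doc[key];
    if (Array.isArray(rule)) {
      // Embedded array: validate each child against the child schema.
      for (const child of value || []) validate(child, rule[0]);
    } else if (typeof rule === 'function' && !rule(value)) {
      throw new Error('invalid path: ' + key);
    }
  }
  return true;
}

const commentSchema = { title: v => typeof v === 'string' };
const postSchema = { title: v => typeof v === 'string', comments: [commentSchema] };

console.log(validate(
  { title: 'post', comments: [{ title: 'hi' }] },
  postSchema
)); // true
```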
### Middleware
See the [docs](http://mongoosejs.com/docs/middleware.html) page.
#### Intercepting and mutating method arguments
You can intercept method arguments via middleware.
For example, this would allow you to broadcast changes about your Documents every time someone `set`s a path in your Document to a new value:
```js
schema.pre('set', function(next, path, val, type) {
// `this` is the current Document
this.emit('set', path, val);
// Pass control to the next pre
next();
});
```
Moreover, you can mutate the incoming `method` arguments so that subsequent middleware see different values for those arguments. To do so, just pass the new values to `next`:
```js
schema.pre(method, function firstPre(next, methodArg1, methodArg2) {
// Mutate methodArg1
next('altered-' + methodArg1.toString(), methodArg2);
});
// pre declaration is chainable
schema.pre(method, function secondPre(next, methodArg1, methodArg2) {
console.log(methodArg1);
// => 'altered-originalValOfMethodArg1'
console.log(methodArg2);
// => 'originalValOfMethodArg2'
// Passing no arguments to `next` automatically passes along the current argument values
// i.e., the following `next()` is equivalent to `next(methodArg1, methodArg2)`
// and also equivalent to, with the example method arg
// values, `next('altered-originalValOfMethodArg1', 'originalValOfMethodArg2')`
next();
});
```
#### Schema gotcha
`type`, when used in a schema has special meaning within Mongoose. If your schema requires using `type` as a nested property you must use object notation:
```js
new Schema({
broken: { type: Boolean },
asset: {
name: String,
type: String // uh oh, it broke. asset will be interpreted as String
}
});
new Schema({
works: { type: Boolean },
asset: {
name: String,
type: { type: String } // works. asset is an object with a type property
}
});
```
### Driver Access
Mongoose is built on top of the [official MongoDB Node.js driver](https://github.com/mongodb/node-mongodb-native). Each mongoose model keeps a reference to a [native MongoDB driver collection](http://mongodb.github.io/node-mongodb-native/2.1/api/Collection.html). The collection object can be accessed using `YourModel.collection`. However, using the collection object directly bypasses all mongoose features, including hooks, validation, etc. The one
notable exception is that `YourModel.collection` still buffers
commands. As such, `YourModel.collection.find()` will **not**
return a cursor.
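The trade-off can be pictured with a small sketch (illustrative, not Mongoose code): a model layer that runs validators before writing, wrapping a raw collection that writes unconditionally.

```javascript
// Illustrative sketch of why going through the raw collection skips
// mongoose features. NOT real Mongoose internals.
class RawCollection {
  constructor() { this.docs = []; }
  insertOne(doc) { this.docs.push(doc); } // no validation, no middleware
}

class Model {
  constructor(collection, validators) {
    this.collection = collection;   // like YourModel.collection
    this.validators = validators;
  }
  // Going through the model runs every validator first.
  create(doc) {
    for (const v of this.validators) {
      if (!v(doc)) throw new Error('validation failed');
    }
    this.collection.insertOne(doc);
  }
}

const users = new Model(new RawCollection(), [doc => typeof doc.name === 'string']);
users.create({ name: 'ok' });            // validated, then stored
users.collection.insertOne({ name: 1 }); // bypasses validation entirely
console.log(users.collection.docs.length); // 2
```

Anything written through the raw collection lands in the database untouched by hooks or validation, which is occasionally useful but usually a foot-gun.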
## API Docs
Find the API docs [here](http://mongoosejs.com/docs/api/mongoose.html), generated using [dox](https://github.com/tj/dox)
and [acquit](https://github.com/vkarpov15/acquit).
## Related Projects
### MongoDB Runners
* [run-rs](https://www.npmjs.com/package/run-rs)
* [mongodb-memory-server](https://www.npmjs.com/package/mongodb-memory-server)
* [mongodb-topology-manager](https://www.npmjs.com/package/mongodb-topology-manager)
### Unofficial CLIs
* [mongoosejs-cli](https://www.npmjs.com/package/mongoosejs-cli)
### Data Seeding
* [dookie](https://www.npmjs.com/package/dookie)
* [seedgoose](https://www.npmjs.com/package/seedgoose)
* [mongoose-data-seed](https://www.npmjs.com/package/mongoose-data-seed)
### Express Session Stores
* [connect-mongodb-session](https://www.npmjs.com/package/connect-mongodb-session)
* [connect-mongo](https://www.npmjs.com/package/connect-mongo)
## License
Copyright (c) 2010 LearnBoost &lt;dev@learnboost.com&gt;
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

@@ -0,0 +1 @@
Please follow the instructions on [Tidelift's security page](https://tidelift.com/docs/security) to report a security issue.

@@ -0,0 +1,8 @@
/**
* Export lib/mongoose
*
*/
'use strict';
module.exports = require('./lib/browser');

File diff suppressed because one or more lines are too long

@@ -0,0 +1,64 @@
/**
* Export lib/mongoose
*
*/
'use strict';
const mongoose = require('./lib/');
module.exports = mongoose;
module.exports.default = mongoose;
module.exports.mongoose = mongoose;
// Re-export for ESM support
module.exports.cast = mongoose.cast;
module.exports.STATES = mongoose.STATES;
module.exports.setDriver = mongoose.setDriver;
module.exports.set = mongoose.set;
module.exports.get = mongoose.get;
module.exports.createConnection = mongoose.createConnection;
module.exports.connect = mongoose.connect;
module.exports.disconnect = mongoose.disconnect;
module.exports.startSession = mongoose.startSession;
module.exports.pluralize = mongoose.pluralize;
module.exports.model = mongoose.model;
module.exports.deleteModel = mongoose.deleteModel;
module.exports.modelNames = mongoose.modelNames;
module.exports.plugin = mongoose.plugin;
module.exports.connections = mongoose.connections;
module.exports.version = mongoose.version;
module.exports.Aggregate = mongoose.Aggregate;
module.exports.Mongoose = mongoose.Mongoose;
module.exports.Schema = mongoose.Schema;
module.exports.SchemaType = mongoose.SchemaType;
module.exports.SchemaTypes = mongoose.SchemaTypes;
module.exports.VirtualType = mongoose.VirtualType;
module.exports.Types = mongoose.Types;
module.exports.Query = mongoose.Query;
module.exports.Model = mongoose.Model;
module.exports.Document = mongoose.Document;
module.exports.ObjectId = mongoose.ObjectId;
module.exports.isValidObjectId = mongoose.isValidObjectId;
module.exports.isObjectIdOrHexString = mongoose.isObjectIdOrHexString;
module.exports.syncIndexes = mongoose.syncIndexes;
module.exports.Decimal128 = mongoose.Decimal128;
module.exports.Mixed = mongoose.Mixed;
module.exports.Date = mongoose.Date;
module.exports.Number = mongoose.Number;
module.exports.Error = mongoose.Error;
module.exports.MongooseError = mongoose.MongooseError;
module.exports.now = mongoose.now;
module.exports.CastError = mongoose.CastError;
module.exports.SchemaTypeOptions = mongoose.SchemaTypeOptions;
module.exports.mongo = mongoose.mongo;
module.exports.mquery = mongoose.mquery;
module.exports.sanitizeFilter = mongoose.sanitizeFilter;
module.exports.trusted = mongoose.trusted;
module.exports.skipMiddlewareFunction = mongoose.skipMiddlewareFunction;
module.exports.overwriteMiddlewareResult = mongoose.overwriteMiddlewareResult;
// The following properties are not exported using ESM because `setDriver()` can mutate these
// module.exports.connection = mongoose.connection;
// module.exports.Collection = mongoose.Collection;
// module.exports.Connection = mongoose.Connection;

File diff suppressed because it is too large

@@ -0,0 +1,141 @@
/* eslint-env browser */
'use strict';
require('./driver').set(require('./drivers/browser'));
const DocumentProvider = require('./documentProvider.js');
DocumentProvider.setBrowser(true);
/**
* The [MongooseError](https://mongoosejs.com/docs/api/error.html#Error()) constructor.
*
* @method Error
* @api public
*/
exports.Error = require('./error/index');
/**
* The Mongoose [Schema](https://mongoosejs.com/docs/api/schema.html#Schema()) constructor
*
* #### Example:
*
* const mongoose = require('mongoose');
* const Schema = mongoose.Schema;
* const CatSchema = new Schema(..);
*
* @method Schema
* @api public
*/
exports.Schema = require('./schema');
/**
* The various Mongoose Types.
*
* #### Example:
*
* const mongoose = require('mongoose');
* const array = mongoose.Types.Array;
*
* #### Types:
*
* - [Array](https://mongoosejs.com/docs/schematypes.html#arrays)
* - [Buffer](https://mongoosejs.com/docs/schematypes.html#buffers)
* - [Embedded](https://mongoosejs.com/docs/schematypes.html#schemas)
* - [DocumentArray](https://mongoosejs.com/docs/api/documentarraypath.html)
* - [Decimal128](https://mongoosejs.com/docs/api/decimal128.html#Decimal128())
* - [ObjectId](https://mongoosejs.com/docs/schematypes.html#objectids)
* - [Map](https://mongoosejs.com/docs/schematypes.html#maps)
* - [Subdocument](https://mongoosejs.com/docs/schematypes.html#schemas)
*
* Using this exposed access to the `ObjectId` type, we can construct ids on demand.
*
* const ObjectId = mongoose.Types.ObjectId;
* const id1 = new ObjectId;
*
* @property Types
* @api public
*/
exports.Types = require('./types');
/**
* The Mongoose [VirtualType](https://mongoosejs.com/docs/api/virtualtype.html#VirtualType()) constructor
*
* @method VirtualType
* @api public
*/
exports.VirtualType = require('./virtualType');
/**
 * The Mongoose SchemaType constructor.
 *
 * #### Note:
 *
 * _The schema types themselves are available as `mongoose.Schema.Types`._
 *
 * @method SchemaType
 * @see Schema.Types https://mongoosejs.com/docs/api/schema.html#Schema.Types
 * @api public
 */
exports.SchemaType = require('./schemaType.js');
/**
* The constructor used for schematype options
*
* @method SchemaTypeOptions
* @api public
*/
exports.SchemaTypeOptions = require('./options/schemaTypeOptions');
/**
* Internal utils
*
* @property utils
* @api private
*/
exports.utils = require('./utils.js');
/**
* The Mongoose browser [Document](/api/document.html) constructor.
*
* @method Document
* @api public
*/
exports.Document = DocumentProvider();
/**
* Return a new browser model. In the browser, a model is just
* a simplified document with a schema - it does **not** have
* functions like `findOne()`, etc.
*
* @method model
* @api public
* @param {String} name
* @param {Schema} schema
* @return Class
*/
exports.model = function(name, schema) {
class Model extends exports.Document {
constructor(obj, fields) {
super(obj, schema, fields);
}
}
Model.modelName = name;
return Model;
};
/*!
* Module exports.
*/
if (typeof window !== 'undefined') {
window.mongoose = module.exports;
window.Buffer = Buffer;
}

@@ -0,0 +1,101 @@
/*!
* Module dependencies.
*/
'use strict';
const NodeJSDocument = require('./document');
const EventEmitter = require('events').EventEmitter;
const MongooseError = require('./error/index');
const Schema = require('./schema');
const ObjectId = require('./types/objectid');
const ValidationError = MongooseError.ValidationError;
const applyHooks = require('./helpers/model/applyHooks');
const isObject = require('./helpers/isObject');
/**
* Document constructor.
*
* @param {Object} obj the values to set
* @param {Object} schema
* @param {Object} [fields] optional object containing the fields which were selected in the query returning this document and any populated paths data
* @param {Boolean} [skipId] bool, should we auto create an ObjectId _id
* @inherits NodeJS EventEmitter https://nodejs.org/api/events.html#class-eventemitter
* @event `init`: Emitted on a document after it has been retrieved from the db and fully hydrated by Mongoose.
* @event `save`: Emitted when the document is successfully saved
* @api private
*/
function Document(obj, schema, fields, skipId, skipInit) {
if (!(this instanceof Document)) {
return new Document(obj, schema, fields, skipId, skipInit);
}
if (isObject(schema) && !schema.instanceOfSchema) {
schema = new Schema(schema);
}
// When creating an EmbeddedDocument, it already has the schema and doesn't need the _id
schema = this.schema || schema;
// Generate an ObjectId if it is missing, but that requires a schema
if (!this.schema && schema.options._id) {
obj = obj || {};
if (obj._id === undefined) {
obj._id = new ObjectId();
}
}
if (!schema) {
throw new MongooseError.MissingSchemaError();
}
this.$__setSchema(schema);
NodeJSDocument.call(this, obj, fields, skipId, skipInit);
applyHooks(this, schema, { decorateDoc: true });
// apply methods
for (const m in schema.methods) {
this[m] = schema.methods[m];
}
// apply statics
for (const s in schema.statics) {
this[s] = schema.statics[s];
}
}
/*!
* Inherit from the NodeJS document
*/
Document.prototype = Object.create(NodeJSDocument.prototype);
Document.prototype.constructor = Document;
/*!
* ignore
*/
Document.events = new EventEmitter();
/*!
* Browser doc exposes the event emitter API
*/
Document.$emitter = new EventEmitter();
['on', 'once', 'emit', 'listeners', 'removeListener', 'setMaxListeners',
'removeAllListeners', 'addListener'].forEach(function(emitterFn) {
Document[emitterFn] = function() {
return Document.$emitter[emitterFn].apply(Document.$emitter, arguments);
};
});
/*!
* Module exports.
*/
Document.ValidationError = ValidationError;
module.exports = exports = Document;

@@ -0,0 +1,444 @@
'use strict';
/*!
* Module dependencies.
*/
const CastError = require('./error/cast');
const StrictModeError = require('./error/strict');
const Types = require('./schema/index');
const cast$expr = require('./helpers/query/cast$expr');
const castString = require('./cast/string');
const castTextSearch = require('./schema/operators/text');
const get = require('./helpers/get');
const getSchemaDiscriminatorByValue = require('./helpers/discriminator/getSchemaDiscriminatorByValue');
const isOperator = require('./helpers/query/isOperator');
const util = require('util');
const isObject = require('./helpers/isObject');
const isMongooseObject = require('./helpers/isMongooseObject');
const utils = require('./utils');
const ALLOWED_GEOWITHIN_GEOJSON_TYPES = ['Polygon', 'MultiPolygon'];
/**
* Handles internal casting for query filters.
*
* @param {Schema} schema
* @param {Object} obj Object to cast
* @param {Object} [options] the query options
* @param {Boolean|"throw"} [options.strict] Whether to enable all strict options
* @param {Boolean|"throw"} [options.strictQuery] Enable strict queries
* @param {Boolean} [options.sanitizeFilter] avoid adding implicit query selectors ($in)
* @param {Boolean} [options.upsert]
* @param {Query} [context] passed to setters
* @api private
*/
module.exports = function cast(schema, obj, options, context) {
if (Array.isArray(obj)) {
throw new Error('Query filter must be an object, got an array: ' + util.inspect(obj));
}
if (obj == null) {
return obj;
}
if (schema != null && schema.discriminators != null && obj[schema.options.discriminatorKey] != null) {
schema = getSchemaDiscriminatorByValue(schema, obj[schema.options.discriminatorKey]) || schema;
}
const paths = Object.keys(obj);
let i = paths.length;
let _keys;
let any$conditionals;
let schematype;
let nested;
let path;
let type;
let val;
options = options || {};
while (i--) {
path = paths[i];
val = obj[path];
if (path === '$or' || path === '$nor' || path === '$and') {
if (!Array.isArray(val)) {
throw new CastError('Array', val, path);
}
for (let k = val.length - 1; k >= 0; k--) {
if (val[k] == null || typeof val[k] !== 'object') {
throw new CastError('Object', val[k], path + '.' + k);
}
const beforeCastKeysLength = Object.keys(val[k]).length;
const discriminatorValue = val[k][schema.options.discriminatorKey];
if (discriminatorValue == null) {
val[k] = cast(schema, val[k], options, context);
} else {
const discriminatorSchema = getSchemaDiscriminatorByValue(context.schema, discriminatorValue);
val[k] = cast(discriminatorSchema ? discriminatorSchema : schema, val[k], options, context);
}
if (Object.keys(val[k]).length === 0 && beforeCastKeysLength !== 0) {
val.splice(k, 1);
}
}
// delete empty: {$or: []} -> {}
if (val.length === 0) {
delete obj[path];
}
} else if (path === '$where') {
type = typeof val;
if (type !== 'string' && type !== 'function') {
throw new Error('Must have a string or function for $where');
}
if (type === 'function') {
obj[path] = val.toString();
}
continue;
} else if (path === '$expr') {
val = cast$expr(val, schema);
continue;
} else if (path === '$elemMatch') {
val = cast(schema, val, options, context);
} else if (path === '$text') {
val = castTextSearch(val, path);
} else if (path === '$comment' && !schema.paths.hasOwnProperty('$comment')) {
val = castString(val, path);
obj[path] = val;
} else {
if (!schema) {
// no casting for Mixed types
continue;
}
schematype = schema.path(path);
// Check for embedded discriminator paths
if (!schematype) {
const split = path.split('.');
let j = split.length;
while (j--) {
const pathFirstHalf = split.slice(0, j).join('.');
const pathLastHalf = split.slice(j).join('.');
const _schematype = schema.path(pathFirstHalf);
const discriminatorKey = _schematype &&
_schematype.schema &&
_schematype.schema.options &&
_schematype.schema.options.discriminatorKey;
// gh-6027: if we haven't found the schematype but this path is
// underneath an embedded discriminator and the embedded discriminator
// key is in the query, use the embedded discriminator schema
if (_schematype != null &&
(_schematype.schema && _schematype.schema.discriminators) != null &&
discriminatorKey != null &&
pathLastHalf !== discriminatorKey) {
const discriminatorVal = get(obj, pathFirstHalf + '.' + discriminatorKey);
const discriminators = _schematype.schema.discriminators;
if (typeof discriminatorVal === 'string' && discriminators[discriminatorVal] != null) {
schematype = discriminators[discriminatorVal].path(pathLastHalf);
} else if (discriminatorVal != null &&
Object.keys(discriminatorVal).length === 1 &&
Array.isArray(discriminatorVal.$in) &&
discriminatorVal.$in.length === 1 &&
typeof discriminatorVal.$in[0] === 'string' &&
discriminators[discriminatorVal.$in[0]] != null) {
schematype = discriminators[discriminatorVal.$in[0]].path(pathLastHalf);
}
}
}
}
if (!schematype) {
// Handle potential embedded array queries
const split = path.split('.');
let j = split.length;
let pathFirstHalf;
let pathLastHalf;
let remainingConds;
// Find the part of the var path that is a path of the Schema
while (j--) {
pathFirstHalf = split.slice(0, j).join('.');
schematype = schema.path(pathFirstHalf);
if (schematype) {
break;
}
}
// If a substring of the input path resolves to an actual real path...
if (schematype) {
// Apply the casting; similar code for $elemMatch in schema/array.js
if (schematype.caster && schematype.caster.schema) {
remainingConds = {};
pathLastHalf = split.slice(j).join('.');
remainingConds[pathLastHalf] = val;
const ret = cast(schematype.caster.schema, remainingConds, options, context)[pathLastHalf];
if (ret === void 0) {
delete obj[path];
} else {
obj[path] = ret;
}
} else {
obj[path] = val;
}
continue;
}
if (isObject(val)) {
// handle geo schemas that use object notation
// { loc: { long: Number, lat: Number } }
let geo = '';
if (val.$near) {
geo = '$near';
} else if (val.$nearSphere) {
geo = '$nearSphere';
} else if (val.$within) {
geo = '$within';
} else if (val.$geoIntersects) {
geo = '$geoIntersects';
} else if (val.$geoWithin) {
geo = '$geoWithin';
}
if (geo) {
const numbertype = new Types.Number('__QueryCasting__');
let value = val[geo];
if (val.$maxDistance != null) {
val.$maxDistance = numbertype.castForQuery(
null,
val.$maxDistance,
context
);
}
if (val.$minDistance != null) {
val.$minDistance = numbertype.castForQuery(
null,
val.$minDistance,
context
);
}
if (geo === '$within') {
const withinType = value.$center
|| value.$centerSphere
|| value.$box
|| value.$polygon;
if (!withinType) {
throw new Error('Bad $within parameter: ' + JSON.stringify(val));
}
value = withinType;
} else if (geo === '$near' &&
typeof value.type === 'string' && Array.isArray(value.coordinates)) {
// geojson; cast the coordinates
value = value.coordinates;
} else if ((geo === '$near' || geo === '$nearSphere' || geo === '$geoIntersects') &&
value.$geometry && typeof value.$geometry.type === 'string' &&
Array.isArray(value.$geometry.coordinates)) {
if (value.$maxDistance != null) {
value.$maxDistance = numbertype.castForQuery(
null,
value.$maxDistance,
context
);
}
if (value.$minDistance != null) {
value.$minDistance = numbertype.castForQuery(
null,
value.$minDistance,
context
);
}
if (isMongooseObject(value.$geometry)) {
value.$geometry = value.$geometry.toObject({
transform: false,
virtuals: false
});
}
value = value.$geometry.coordinates;
} else if (geo === '$geoWithin') {
if (value.$geometry) {
if (isMongooseObject(value.$geometry)) {
value.$geometry = value.$geometry.toObject({ virtuals: false });
}
const geoWithinType = value.$geometry.type;
if (ALLOWED_GEOWITHIN_GEOJSON_TYPES.indexOf(geoWithinType) === -1) {
throw new Error('Invalid geoJSON type for $geoWithin "' +
geoWithinType + '", must be "Polygon" or "MultiPolygon"');
}
value = value.$geometry.coordinates;
} else {
value = value.$box || value.$polygon || value.$center ||
value.$centerSphere;
if (isMongooseObject(value)) {
value = value.toObject({ virtuals: false });
}
}
}
_cast(value, numbertype, context);
continue;
}
}
if (schema.nested[path]) {
continue;
}
const strict = 'strict' in options ? options.strict : schema.options.strict;
const strictQuery = getStrictQuery(options, schema._userProvidedOptions, schema.options, context);
if (options.upsert && strict) {
if (strict === 'throw') {
throw new StrictModeError(path);
}
throw new StrictModeError(path, 'Path "' + path + '" is not in ' +
'schema, strict mode is `true`, and upsert is `true`.');
} else if (strictQuery === 'throw') {
throw new StrictModeError(path, 'Path "' + path + '" is not in ' +
'schema and strictQuery is \'throw\'.');
} else if (strictQuery) {
delete obj[path];
}
} else if (val == null) {
continue;
} else if (utils.isPOJO(val)) {
any$conditionals = Object.keys(val).some(isOperator);
if (!any$conditionals) {
obj[path] = schematype.castForQuery(
null,
val,
context
);
} else {
const ks = Object.keys(val);
let $cond;
let k = ks.length;
while (k--) {
$cond = ks[k];
nested = val[$cond];
if ($cond === '$elemMatch') {
if (nested && schematype != null && schematype.schema != null) {
cast(schematype.schema, nested, options, context);
} else if (nested && schematype != null && schematype.$isMongooseArray) {
if (utils.isPOJO(nested) && nested.$not != null) {
cast(schema, nested, options, context);
} else {
val[$cond] = schematype.castForQuery(
$cond,
nested,
context
);
}
}
} else if ($cond === '$not') {
if (nested && schematype) {
_keys = Object.keys(nested);
if (_keys.length && isOperator(_keys[0])) {
for (const key in nested) {
nested[key] = schematype.castForQuery(
key,
nested[key],
context
);
}
} else {
val[$cond] = schematype.castForQuery(
$cond,
nested,
context
);
}
continue;
}
} else {
val[$cond] = schematype.castForQuery(
$cond,
nested,
context
);
}
}
}
} else if (Array.isArray(val) && ['Buffer', 'Array'].indexOf(schematype.instance) === -1 && !options.sanitizeFilter) {
const casted = [];
const valuesArray = val;
for (const _val of valuesArray) {
casted.push(schematype.castForQuery(
null,
_val,
context
));
}
obj[path] = { $in: casted };
} else {
obj[path] = schematype.castForQuery(
null,
val,
context
);
}
}
}
return obj;
};
function _cast(val, numbertype, context) {
if (Array.isArray(val)) {
val.forEach(function(item, i) {
if (Array.isArray(item) || isObject(item)) {
return _cast(item, numbertype, context);
}
val[i] = numbertype.castForQuery(null, item, context);
});
} else {
const nearKeys = Object.keys(val);
let nearLen = nearKeys.length;
while (nearLen--) {
const nkey = nearKeys[nearLen];
const item = val[nkey];
if (Array.isArray(item) || isObject(item)) {
_cast(item, numbertype, context);
val[nkey] = item;
} else {
val[nkey] = numbertype.castForQuery(null, item, context);
}
}
}
}
function getStrictQuery(queryOptions, schemaUserProvidedOptions, schemaOptions, context) {
if ('strictQuery' in queryOptions) {
return queryOptions.strictQuery;
}
if ('strictQuery' in schemaUserProvidedOptions) {
return schemaUserProvidedOptions.strictQuery;
}
const mongooseOptions = context &&
context.mongooseCollection &&
context.mongooseCollection.conn &&
context.mongooseCollection.conn.base &&
context.mongooseCollection.conn.base.options;
if (mongooseOptions) {
if ('strictQuery' in mongooseOptions) {
return mongooseOptions.strictQuery;
}
}
return schemaOptions.strictQuery;
}
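The lookup chain above reads global options off `context.mongooseCollection.conn.base.options`. A hypothetical standalone helper (`resolveStrictQuery` is not part of the source) can sketch the same precedence order: query options win, then the schema's user-provided options, then the global mongoose options, and finally the schema's resolved default.

```javascript
// Sketch of the strictQuery precedence order, under the assumption that
// each options object is already flattened into a plain object.
function resolveStrictQuery(queryOptions, schemaUserOptions, globalOptions, schemaOptions) {
  for (const source of [queryOptions, schemaUserOptions, globalOptions]) {
    if (source != null && 'strictQuery' in source) {
      return source.strictQuery;
    }
  }
  return schemaOptions.strictQuery;
}

// Query-level option overrides the global setting:
console.log(resolveStrictQuery({ strictQuery: 'throw' }, {}, { strictQuery: false }, { strictQuery: true })); // 'throw'
// Nothing set elsewhere: fall back to the schema default:
console.log(resolveStrictQuery({}, {}, {}, { strictQuery: true })); // true
```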


@@ -0,0 +1,46 @@
'use strict';
const { Long } = require('bson');
/**
* Given a value, cast it to a BigInt, or throw an `Error` if the value
* cannot be cast. `null` and `undefined` are considered valid.
*
* @param {Any} value
* @return {BigInt|null|undefined}
* @throws {Error} if `value` is not one of the allowed values
* @api private
*/
const MAX_BIGINT = 9223372036854775807n;
const MIN_BIGINT = -9223372036854775808n;
const ERROR_MESSAGE = `Mongoose only supports BigInts between ${MIN_BIGINT} and ${MAX_BIGINT} because MongoDB does not support arbitrary precision integers`;
module.exports = function castBigInt(val) {
if (val == null) {
return val;
}
if (val === '') {
return null;
}
if (typeof val === 'bigint') {
if (val > MAX_BIGINT || val < MIN_BIGINT) {
throw new Error(ERROR_MESSAGE);
}
return val;
}
if (val instanceof Long) {
return val.toBigInt();
}
if (typeof val === 'string' || typeof val === 'number') {
val = BigInt(val);
if (val > MAX_BIGINT || val < MIN_BIGINT) {
throw new Error(ERROR_MESSAGE);
}
return val;
}
throw new Error(`Cannot convert value to BigInt: "${val}"`);
};
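A minimal standalone sketch of the same conversion rules, without the BSON `Long` branch (the real helper also accepts `Long` instances via `val.toBigInt()`):

```javascript
// Values must fit in MongoDB's signed 64-bit integer range.
const MAX_BIGINT = 9223372036854775807n;
const MIN_BIGINT = -9223372036854775808n;

function castBigInt(val) {
  if (val == null) return val;   // null/undefined pass through unchanged
  if (val === '') return null;   // empty form values become null
  if (typeof val === 'string' || typeof val === 'number') {
    val = BigInt(val);
  }
  if (typeof val !== 'bigint') {
    throw new Error(`Cannot convert value to BigInt: "${val}"`);
  }
  if (val > MAX_BIGINT || val < MIN_BIGINT) {
    throw new Error('BigInt outside of MongoDB int64 range');
  }
  return val;
}

console.log(castBigInt('42')); // 42n
console.log(castBigInt(''));   // null
```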


@@ -0,0 +1,32 @@
'use strict';
const CastError = require('../error/cast');
/**
* Given a value, cast it to a boolean, or throw a `CastError` if the value
* cannot be cast. `null` and `undefined` are considered valid.
*
* @param {Any} value
* @param {String} [path] optional path to set on the CastError
* @return {Boolean|null|undefined}
* @throws {CastError} if `value` is not one of the allowed values
* @api private
*/
module.exports = function castBoolean(value, path) {
if (module.exports.convertToTrue.has(value)) {
return true;
}
if (module.exports.convertToFalse.has(value)) {
return false;
}
if (value == null) {
return value;
}
throw new CastError('boolean', value, path);
};
module.exports.convertToTrue = new Set([true, 'true', 1, '1', 'yes']);
module.exports.convertToFalse = new Set([false, 'false', 0, '0', 'no']);
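Because the conversion sets are checked before the `null` guard, the behavior is easy to exercise in isolation. A self-contained sketch of the same logic:

```javascript
// Mirror of the convertToTrue/convertToFalse sets used by castBoolean.
const convertToTrue = new Set([true, 'true', 1, '1', 'yes']);
const convertToFalse = new Set([false, 'false', 0, '0', 'no']);

function castBoolean(value) {
  if (convertToTrue.has(value)) return true;
  if (convertToFalse.has(value)) return false;
  if (value == null) return value;  // null and undefined are valid
  throw new Error(`Cannot cast "${value}" to a boolean`);
}

console.log(castBoolean('yes')); // true
console.log(castBoolean('0'));   // false
console.log(castBoolean(null));  // null
```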


@@ -0,0 +1,41 @@
'use strict';
const assert = require('assert');
module.exports = function castDate(value) {
// Support empty string because of empty form values. Originally introduced
// in https://github.com/Automattic/mongoose/commit/efc72a1898fc3c33a319d915b8c5463a22938dfe
if (value == null || value === '') {
return null;
}
if (value instanceof Date) {
assert.ok(!isNaN(value.valueOf()));
return value;
}
let date;
assert.ok(typeof value !== 'boolean');
if (value instanceof Number || typeof value === 'number') {
date = new Date(value);
} else if (typeof value === 'string' && !isNaN(Number(value)) && (Number(value) >= 275761 || Number(value) < -271820)) {
// string representations of milliseconds take this path
date = new Date(Number(value));
} else if (typeof value.valueOf === 'function') {
// support for moment.js. This is also the path strings will take because
// strings have a `valueOf()`
date = new Date(value.valueOf());
} else {
// fallback
date = new Date(value);
}
if (!isNaN(date.valueOf())) {
return date;
}
assert.ok(false);
};
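A simplified sketch of the main branches (the real helper additionally routes out-of-range numeric strings and moment.js-style objects through `valueOf()`):

```javascript
// Reduced castDate: empty values become null, Dates are validated,
// numbers are treated as milliseconds, strings go through Date parsing.
function castDate(value) {
  if (value == null || value === '') return null; // empty form values
  if (value instanceof Date) {
    if (isNaN(value.valueOf())) throw new Error('Invalid date');
    return value;
  }
  const date = typeof value === 'number'
    ? new Date(value)             // milliseconds since the epoch
    : new Date(value.valueOf());  // strings (and moment-like objects)
  if (isNaN(date.valueOf())) throw new Error('Invalid date');
  return date;
}

console.log(castDate(86400000).toISOString()); // '1970-01-02T00:00:00.000Z'
console.log(castDate(''));                     // null
```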


@@ -0,0 +1,39 @@
'use strict';
const Decimal128Type = require('../types/decimal128');
const assert = require('assert');
module.exports = function castDecimal128(value) {
if (value == null) {
return value;
}
if (typeof value === 'object' && typeof value.$numberDecimal === 'string') {
return Decimal128Type.fromString(value.$numberDecimal);
}
if (value instanceof Decimal128Type) {
return value;
}
if (typeof value === 'string') {
return Decimal128Type.fromString(value);
}
if (typeof Buffer === 'function' && Buffer.isBuffer(value)) {
return new Decimal128Type(value);
}
if (typeof Uint8Array === 'function' && value instanceof Uint8Array) {
return new Decimal128Type(value);
}
if (typeof value === 'number') {
return Decimal128Type.fromString(String(value));
}
if (typeof value.valueOf === 'function' && typeof value.valueOf() === 'string') {
return Decimal128Type.fromString(value.valueOf());
}
assert.ok(false);
};


@@ -0,0 +1,50 @@
'use strict';
const assert = require('assert');
const BSON = require('bson');
const isBsonType = require('../helpers/isBsonType');
/**
* Given a value, cast it to an IEEE 754-2008 floating point, or throw an `Error` if the value
* cannot be cast. `null`, `undefined`, and `NaN` are considered valid inputs.
*
* @param {Any} value
* @return {Double|null}
* @throws {Error} if `value` does not represent an IEEE 754-2008 floating point. If casting from a string, see [BSON Double.fromString API documentation](https://mongodb.github.io/node-mongodb-native/Next/classes/BSON.Double.html#fromString)
* @api private
*/
module.exports = function castDouble(val) {
if (val == null || val === '') {
return null;
}
let coercedVal;
if (isBsonType(val, 'Long')) {
coercedVal = val.toNumber();
} else if (typeof val === 'string') {
try {
coercedVal = BSON.Double.fromString(val);
return coercedVal;
} catch {
assert.ok(false);
}
} else if (typeof val === 'object') {
const tempVal = val.valueOf() ?? val.toString();
// objects whose valueOf() or toString() yields a string
// (e.g. { valueOf: () => 'helloworld' }) are parsed as string
// doubles; unparseable strings throw
if (typeof tempVal === 'string') {
try {
coercedVal = BSON.Double.fromString(tempVal);
return coercedVal;
} catch {
assert.ok(false);
}
} else {
coercedVal = Number(tempVal);
}
} else {
coercedVal = Number(val);
}
return new BSON.Double(coercedVal);
};


@@ -0,0 +1,36 @@
'use strict';
const isBsonType = require('../helpers/isBsonType');
const assert = require('assert');
/**
* Given a value, cast it to an Int32, or throw an `Error` if the value
* cannot be cast. `null` and `undefined` are considered valid.
*
* @param {Any} value
* @return {Number}
* @throws {Error} if `value` does not represent an integer, or is outside the bounds of a 32-bit integer.
* @api private
*/
module.exports = function castInt32(val) {
if (val == null) {
return val;
}
if (val === '') {
return null;
}
const coercedVal = isBsonType(val, 'Long') ? val.toNumber() : Number(val);
const INT32_MAX = 0x7FFFFFFF;
const INT32_MIN = -0x80000000;
if (coercedVal === (coercedVal | 0) &&
coercedVal >= INT32_MIN &&
coercedVal <= INT32_MAX
) {
return coercedVal;
}
assert.ok(false);
};
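The range check is the interesting part: `(n | 0) === n` is only true when `n` is exactly representable as a signed 32-bit integer, which rejects fractions and out-of-range values in one comparison. A standalone sketch:

```javascript
// Reduced castInt32 (the real helper also unwraps BSON Long via toNumber()).
function castInt32(val) {
  if (val == null) return val;
  if (val === '') return null;
  const n = Number(val);
  const INT32_MAX = 0x7FFFFFFF;   //  2147483647
  const INT32_MIN = -0x80000000;  // -2147483648
  // Bitwise OR truncates to int32, so this rejects 1.5, 2**31, NaN, etc.
  if (n === (n | 0) && n >= INT32_MIN && n <= INT32_MAX) return n;
  throw new Error(`"${val}" is not a 32-bit integer`);
}

console.log(castInt32('123'));      // 123
console.log(castInt32(2147483647)); // 2147483647
```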


@@ -0,0 +1,42 @@
'use strict';
const assert = require('assert');
/**
* Given a value, cast it to a number, or throw an `Error` if the value
* cannot be cast. `null` and `undefined` are considered valid.
*
* @param {Any} value
* @return {Number}
* @throws {Error} if `value` is not one of the allowed values
* @api private
*/
module.exports = function castNumber(val) {
if (val == null) {
return val;
}
if (val === '') {
return null;
}
if (typeof val === 'string' || typeof val === 'boolean') {
val = Number(val);
}
assert.ok(!isNaN(val));
if (val instanceof Number) {
return val.valueOf();
}
if (typeof val === 'number') {
return val;
}
if (!Array.isArray(val) && typeof val.valueOf === 'function') {
return Number(val.valueOf());
}
if (val.toString && !Array.isArray(val) && val.toString() == Number(val)) {
return Number(val);
}
assert.ok(false);
};
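A trimmed-down sketch of the same coercion order, omitting the `valueOf()`/`toString()` fallback branches:

```javascript
// Reduced castNumber: strings and booleans are coerced with Number(),
// NaN is rejected, and Number wrapper objects are unwrapped.
function castNumber(val) {
  if (val == null) return val;
  if (val === '') return null;
  if (typeof val === 'string' || typeof val === 'boolean') val = Number(val);
  if (typeof val === 'number' && !isNaN(val)) return val;
  if (val instanceof Number) return val.valueOf();
  throw new Error(`Cannot cast "${val}" to a number`);
}

console.log(castNumber('3.5')); // 3.5
console.log(castNumber(true));  // 1
```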


@@ -0,0 +1,29 @@
'use strict';
const isBsonType = require('../helpers/isBsonType');
const ObjectId = require('../types/objectid');
module.exports = function castObjectId(value) {
if (value == null) {
return value;
}
if (isBsonType(value, 'ObjectId')) {
return value;
}
if (value._id) {
if (isBsonType(value._id, 'ObjectId')) {
return value._id;
}
if (value._id.toString instanceof Function) {
return new ObjectId(value._id.toString());
}
}
if (value.toString instanceof Function) {
return new ObjectId(value.toString());
}
return new ObjectId(value);
};


@@ -0,0 +1,37 @@
'use strict';
const CastError = require('../error/cast');
/**
* Given a value, cast it to a string, or throw a `CastError` if the value
* cannot be cast. `null` and `undefined` are considered valid.
*
* @param {Any} value
* @param {String} [path] optional path to set on the CastError
* @return {string|null|undefined}
* @throws {CastError}
* @api private
*/
module.exports = function castString(value, path) {
// If null or undefined
if (value == null) {
return value;
}
// handle documents being passed
if (value._id && typeof value._id === 'string') {
return value._id;
}
// Re: gh-647 and gh-3030, we're ok with casting using `toString()`
// **unless** it's the default Object.toString, because "[object Object]"
// doesn't really qualify as useful data
if (value.toString &&
value.toString !== Object.prototype.toString &&
!Array.isArray(value)) {
return value.toString();
}
throw new CastError('string', value, path);
};
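The function above can be exercised standalone; this sketch reproduces its three paths (document-with-string-`_id`, meaningful `toString()`, and rejection) with a plain `Error` in place of `CastError`:

```javascript
// Reduced castString: documents with a string `_id` collapse to that id;
// anything with a non-default toString() (numbers, ObjectIds, dates) is
// stringified; plain objects and arrays are rejected.
function castString(value) {
  if (value == null) return value;
  if (value._id && typeof value._id === 'string') return value._id;
  if (value.toString &&
      value.toString !== Object.prototype.toString &&
      !Array.isArray(value)) {
    return value.toString();
  }
  throw new Error(`Cannot cast "${value}" to a string`);
}

console.log(castString(42));             // '42'
console.log(castString({ _id: 'abc' })); // 'abc'
```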


@@ -0,0 +1,78 @@
'use strict';
const MongooseBuffer = require('../types/buffer');
const UUID_FORMAT = /[0-9a-f]{8}-[0-9a-f]{4}-[0-9][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}/i;
const Binary = MongooseBuffer.Binary;
module.exports = function castUUID(value) {
if (value == null) {
return value;
}
function newBuffer(initbuff) {
const buff = new MongooseBuffer(initbuff);
buff._subtype = 4;
return buff;
}
if (typeof value === 'string') {
if (UUID_FORMAT.test(value)) {
return stringToBinary(value);
} else {
throw new Error(`"${value}" is not a valid UUID string`);
}
}
if (Buffer.isBuffer(value)) {
return newBuffer(value);
}
if (value instanceof Binary) {
return newBuffer(value.value(true));
}
// Re: gh-647 and gh-3030, we're ok with casting using `toString()`
// **unless** it's the default Object.toString, because "[object Object]"
// doesn't really qualify as useful data
if (value.toString && value.toString !== Object.prototype.toString) {
if (UUID_FORMAT.test(value.toString())) {
return stringToBinary(value.toString());
}
}
throw new Error(`"${value}" cannot be cast to a UUID`);
};
module.exports.UUID_FORMAT = UUID_FORMAT;
/**
* Helper function to convert the input hex-string to a buffer
* @param {String} hex The hex string to convert
* @returns {Buffer} The hex as buffer
* @api private
*/
function hex2buffer(hex) {
// use buffer built-in function to convert from hex-string to buffer
const buff = hex != null && Buffer.from(hex, 'hex');
return buff;
}
/**
* Convert a String to Binary
* @param {String} uuidStr The value to process
* @returns {MongooseBuffer} The binary to store
* @api private
*/
function stringToBinary(uuidStr) {
// Protect against undefined & throwing err
if (typeof uuidStr !== 'string') uuidStr = '';
const hex = uuidStr.replace(/[{}-]/g, ''); // remove extra characters
const bytes = hex2buffer(hex);
const buff = new MongooseBuffer(bytes);
buff._subtype = 4;
return buff;
}
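The format check and hex decoding above are easy to demonstrate in isolation. Note the regex encodes RFC 4122 structure (a version digit, then a variant nibble restricted to `[89ab]`), and that stripping the dashes leaves exactly 32 hex characters, i.e. 16 bytes:

```javascript
// Same pattern as the UUID_FORMAT constant above.
const UUID_FORMAT = /[0-9a-f]{8}-[0-9a-f]{4}-[0-9][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}/i;

const id = '123e4567-e89b-12d3-a456-426614174000';
console.log(UUID_FORMAT.test(id)); // true

// stringToBinary strips braces/dashes and hex-decodes to 16 raw bytes:
const bytes = Buffer.from(id.replace(/[{}-]/g, ''), 'hex');
console.log(bytes.length); // 16
```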


@@ -0,0 +1,321 @@
'use strict';
/*!
* Module dependencies.
*/
const EventEmitter = require('events').EventEmitter;
const STATES = require('./connectionState');
const immediate = require('./helpers/immediate');
/**
* Abstract Collection constructor
*
* This is the base class that drivers inherit from and implement.
*
* @param {String} name name of the collection
* @param {Connection} conn A MongooseConnection instance
* @param {Object} [opts] optional collection options
* @api public
*/
function Collection(name, conn, opts) {
if (opts === void 0) {
opts = {};
}
this.opts = opts;
this.name = name;
this.collectionName = name;
this.conn = conn;
this.queue = [];
this.buffer = !conn?._hasOpened;
this.emitter = new EventEmitter();
if (STATES.connected === this.conn.readyState) {
this.onOpen();
}
}
/**
* The collection name
*
* @api public
* @property name
*/
Collection.prototype.name;
/**
* The collection name
*
* @api public
* @property collectionName
*/
Collection.prototype.collectionName;
/**
* The Connection instance
*
* @api public
* @property conn
*/
Collection.prototype.conn;
/**
* Called when the database connects
*
* @api private
*/
Collection.prototype.onOpen = function() {
this.buffer = false;
immediate(() => this.doQueue());
};
/**
* Called when the database disconnects
*
* @api private
*/
Collection.prototype.onClose = function() {};
/**
* Queues a method for later execution when its
* database connection opens.
*
* @param {String} name name of the method to queue
* @param {Array} args arguments to pass to the method when executed
* @api private
*/
Collection.prototype.addQueue = function(name, args) {
this.queue.push([name, args]);
return this;
};
/**
* Removes a queued method
*
* @param {String} name name of the queued method to remove
* @param {Array} args the arguments array the method was queued with
* @api private
*/
Collection.prototype.removeQueue = function(name, args) {
const index = this.queue.findIndex(v => v[0] === name && v[1] === args);
if (index === -1) {
return false;
}
this.queue.splice(index, 1);
return true;
};
/**
* Executes all queued methods and clears the queue.
*
* @api private
*/
Collection.prototype.doQueue = function() {
for (const method of this.queue) {
if (typeof method[0] === 'function') {
method[0].apply(this, method[1]);
} else {
this[method[0]].apply(this, method[1]);
}
}
this.queue = [];
const _this = this;
immediate(function() {
_this.emitter.emit('queue');
});
return this;
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.ensureIndex = function() {
throw new Error('Collection#ensureIndex unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.createIndex = function() {
throw new Error('Collection#createIndex unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.findAndModify = function() {
throw new Error('Collection#findAndModify unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.findOneAndUpdate = function() {
throw new Error('Collection#findOneAndUpdate unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.findOneAndDelete = function() {
throw new Error('Collection#findOneAndDelete unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.findOneAndReplace = function() {
throw new Error('Collection#findOneAndReplace unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.findOne = function() {
throw new Error('Collection#findOne unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.find = function() {
throw new Error('Collection#find unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.insert = function() {
throw new Error('Collection#insert unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.insertOne = function() {
throw new Error('Collection#insertOne unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.insertMany = function() {
throw new Error('Collection#insertMany unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.save = function() {
throw new Error('Collection#save unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.updateOne = function() {
throw new Error('Collection#updateOne unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.updateMany = function() {
throw new Error('Collection#updateMany unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.deleteOne = function() {
throw new Error('Collection#deleteOne unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.deleteMany = function() {
throw new Error('Collection#deleteMany unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.getIndexes = function() {
throw new Error('Collection#getIndexes unimplemented by driver');
};
/**
* Abstract method that drivers must implement.
*/
Collection.prototype.watch = function() {
throw new Error('Collection#watch unimplemented by driver');
};
/*!
* ignore
*/
Collection.prototype._shouldBufferCommands = function _shouldBufferCommands() {
const opts = this.opts;
if (opts.bufferCommands != null) {
return opts.bufferCommands;
}
if (opts && opts.schemaUserProvidedOptions != null && opts.schemaUserProvidedOptions.bufferCommands != null) {
return opts.schemaUserProvidedOptions.bufferCommands;
}
return this.conn._shouldBufferCommands();
};
/*!
* ignore
*/
Collection.prototype._getBufferTimeoutMS = function _getBufferTimeoutMS() {
const conn = this.conn;
const opts = this.opts;
if (opts.bufferTimeoutMS != null) {
return opts.bufferTimeoutMS;
}
if (opts && opts.schemaUserProvidedOptions != null && opts.schemaUserProvidedOptions.bufferTimeoutMS != null) {
return opts.schemaUserProvidedOptions.bufferTimeoutMS;
}
return conn._getBufferTimeoutMS();
};
/*!
* Module exports.
*/
module.exports = Collection;
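The `buffer`/`queue`/`doQueue` trio above implements a buffer-and-flush pattern: operations invoked before the connection opens are recorded by method name, then replayed in order once `onOpen()` fires. A minimal sketch with a hypothetical `FakeCollection` (not part of the source):

```javascript
// Buffer-and-flush pattern: calls made while buffering are queued as
// [methodName, args] pairs and replayed once the "connection" opens.
class FakeCollection {
  constructor() {
    this.buffer = true;
    this.queue = [];
    this.log = [];
  }
  find(query) {
    if (this.buffer) {
      this.queue.push(['find', [query]]);
      return;
    }
    this.log.push(query);
  }
  onOpen() {
    this.buffer = false;
    for (const [name, args] of this.queue) this[name](...args);
    this.queue = [];
  }
}

const c = new FakeCollection();
c.find('a');        // buffered, nothing runs yet
c.find('b');
c.onOpen();         // replays both queued calls in order
console.log(c.log); // [ 'a', 'b' ]
```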

File diff suppressed because it is too large.


@@ -0,0 +1,26 @@
/*!
* Connection states
*/
'use strict';
const STATES = module.exports = exports = Object.create(null);
const disconnected = 'disconnected';
const connected = 'connected';
const connecting = 'connecting';
const disconnecting = 'disconnecting';
const uninitialized = 'uninitialized';
STATES[0] = disconnected;
STATES[1] = connected;
STATES[2] = connecting;
STATES[3] = disconnecting;
STATES[99] = uninitialized;
STATES[disconnected] = 0;
STATES[connected] = 1;
STATES[connecting] = 2;
STATES[disconnecting] = 3;
STATES[uninitialized] = 99;
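The table is deliberately bidirectional, so callers can translate either way between a numeric `readyState` and its name. A compact equivalent construction:

```javascript
// Build the same two-way mapping from a single source of truth.
const STATES = Object.create(null);
const names = {
  0: 'disconnected',
  1: 'connected',
  2: 'connecting',
  3: 'disconnecting',
  99: 'uninitialized'
};
for (const [code, name] of Object.entries(names)) {
  STATES[code] = name;          // number -> name
  STATES[name] = Number(code);  // name -> number
}

console.log(STATES[1]);            // 'connected'
console.log(STATES.disconnecting); // 3
```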


@@ -0,0 +1,73 @@
'use strict';
/*!
* ignore
*/
const queryOperations = Object.freeze([
// Read
'countDocuments',
'distinct',
'estimatedDocumentCount',
'find',
'findOne',
// Update
'findOneAndReplace',
'findOneAndUpdate',
'replaceOne',
'updateMany',
'updateOne',
// Delete
'deleteMany',
'deleteOne',
'findOneAndDelete'
]);
exports.queryOperations = queryOperations;
/*!
* ignore
*/
const queryMiddlewareFunctions = queryOperations.concat([
'validate'
]);
exports.queryMiddlewareFunctions = queryMiddlewareFunctions;
/*!
* ignore
*/
const aggregateMiddlewareFunctions = [
'aggregate'
];
exports.aggregateMiddlewareFunctions = aggregateMiddlewareFunctions;
/*!
* ignore
*/
const modelMiddlewareFunctions = [
'bulkWrite',
'createCollection',
'insertMany'
];
exports.modelMiddlewareFunctions = modelMiddlewareFunctions;
/*!
* ignore
*/
const documentMiddlewareFunctions = [
'validate',
'save',
'remove',
'updateOne',
'deleteOne',
'init'
];
exports.documentMiddlewareFunctions = documentMiddlewareFunctions;


@@ -0,0 +1,476 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('../error/mongooseError');
const Readable = require('stream').Readable;
const eachAsync = require('../helpers/cursor/eachAsync');
const immediate = require('../helpers/immediate');
const kareem = require('kareem');
const util = require('util');
/**
* An AggregationCursor is a concurrency primitive for processing aggregation
* results one document at a time. It is analogous to QueryCursor.
*
* An AggregationCursor fulfills the Node.js streams3 API,
* in addition to several other mechanisms for loading documents from MongoDB
* one at a time.
*
* Creating an AggregationCursor executes the model's pre aggregate hooks,
* but **not** the model's post aggregate hooks.
*
* Unless you're an advanced user, do **not** instantiate this class directly.
* Use [`Aggregate#cursor()`](https://mongoosejs.com/docs/api/aggregate.html#Aggregate.prototype.cursor()) instead.
*
* @param {Aggregate} agg
* @inherits Readable https://nodejs.org/api/stream.html#class-streamreadable
* @event `cursor`: Emitted when the cursor is created
* @event `error`: Emitted when an error occurred
* @event `data`: Emitted when the stream is flowing and the next doc is ready
* @event `end`: Emitted when the stream is exhausted
* @api public
*/
function AggregationCursor(agg) {
// set autoDestroy=true because on node 12 it's by default false
// gh-10902 need autoDestroy to destroy correctly and emit 'close' event
Readable.call(this, { autoDestroy: true, objectMode: true });
this.cursor = null;
this.agg = agg;
this._transforms = [];
const connection = agg._connection;
const model = agg._model;
delete agg.options.cursor.useMongooseAggCursor;
this._mongooseOptions = {};
if (connection) {
this.cursor = connection.db.aggregate(agg._pipeline, agg.options || {});
setImmediate(() => this.emit('cursor', this.cursor));
} else {
_init(model, this, agg);
}
}
util.inherits(AggregationCursor, Readable);
/*!
* ignore
*/
function _init(model, c, agg) {
if (!model.collection.buffer) {
model.hooks.execPre('aggregate', agg, function(err) {
if (err != null) {
_handlePreHookError(c, err);
return;
}
if (typeof agg.options?.cursor?.transform === 'function') {
c._transforms.push(agg.options.cursor.transform);
}
c.cursor = model.collection.aggregate(agg._pipeline, agg.options || {});
c.emit('cursor', c.cursor);
});
} else {
model.collection.emitter.once('queue', function() {
model.hooks.execPre('aggregate', agg, function(err) {
if (err != null) {
_handlePreHookError(c, err);
return;
}
if (typeof agg.options?.cursor?.transform === 'function') {
c._transforms.push(agg.options.cursor.transform);
}
c.cursor = model.collection.aggregate(agg._pipeline, agg.options || {});
c.emit('cursor', c.cursor);
});
});
}
}
/**
* Handles error emitted from pre middleware. In particular, checks for `skipWrappedFunction`, which allows skipping
* the actual aggregation and overwriting the function's return value. Because aggregation cursors don't return a value,
* we need to make sure the user doesn't accidentally set a value in skipWrappedFunction.
*
* @param {QueryCursor} queryCursor
* @param {Error} err
* @returns
*/
function _handlePreHookError(queryCursor, err) {
if (err instanceof kareem.skipWrappedFunction) {
const resultValue = err.args[0];
if (resultValue != null && (!Array.isArray(resultValue) || resultValue.length)) {
const err = new MongooseError(
'Cannot `skipMiddlewareFunction()` with a value when using ' +
'`.aggregate().cursor()`, value must be nullish or empty array, got "' +
util.inspect(resultValue) +
'".'
);
queryCursor._markError(err);
queryCursor.listeners('error').length > 0 && queryCursor.emit('error', err);
return;
}
queryCursor.emit('cursor', null);
return;
}
queryCursor._markError(err);
queryCursor.listeners('error').length > 0 && queryCursor.emit('error', err);
}
/**
* Necessary to satisfy the Readable API
* @method _read
* @memberOf AggregationCursor
* @instance
* @api private
*/
AggregationCursor.prototype._read = function() {
const _this = this;
_next(this, function(error, doc) {
if (error) {
return _this.emit('error', error);
}
if (!doc) {
_this.push(null);
_this.cursor.close(function(error) {
if (error) {
return _this.emit('error', error);
}
});
return;
}
_this.push(doc);
});
};
if (Symbol.asyncIterator != null) {
const msg = 'Mongoose does not support using async iterators with an ' +
'existing aggregation cursor. See https://bit.ly/mongoose-async-iterate-aggregation';
AggregationCursor.prototype[Symbol.asyncIterator] = function() {
throw new MongooseError(msg);
};
}
/**
* Registers a transform function which subsequently maps documents retrieved
* via the streams interface or `.next()`
*
* #### Example:
*
* // Map documents returned by `data` events
* Thing.
* find({ name: /^hello/ }).
* cursor().
* map(function (doc) {
* doc.foo = "bar";
* return doc;
* })
* on('data', function(doc) { console.log(doc.foo); });
*
* // Or map documents returned by `.next()`
* const cursor = Thing.find({ name: /^hello/ }).
* cursor().
* map(function (doc) {
* doc.foo = "bar";
* return doc;
* });
* cursor.next(function(error, doc) {
* console.log(doc.foo);
* });
*
* @param {Function} fn
* @return {AggregationCursor}
* @memberOf AggregationCursor
* @api public
* @method map
*/
Object.defineProperty(AggregationCursor.prototype, 'map', {
value: function(fn) {
this._transforms.push(fn);
return this;
},
enumerable: true,
configurable: true,
writable: true
});
/**
* Marks this cursor as errored
* @method _markError
* @instance
* @memberOf AggregationCursor
* @api private
*/
AggregationCursor.prototype._markError = function(error) {
this._error = error;
return this;
};
/**
* Marks this cursor as closed. Will stop streaming and subsequent calls to
* `next()` will error.
*
* @return {Promise}
* @api public
* @method close
* @emits "close"
* @see AggregationCursor.close https://mongodb.github.io/node-mongodb-native/4.9/classes/AggregationCursor.html#close
*/
AggregationCursor.prototype.close = async function close() {
if (typeof arguments[0] === 'function') {
throw new MongooseError('AggregationCursor.prototype.close() no longer accepts a callback');
}
try {
await this.cursor.close();
} catch (error) {
this.listeners('error').length > 0 && this.emit('error', error);
throw error;
}
this.emit('close');
};
/**
* Marks this cursor as destroyed. Will stop streaming and subsequent calls to
* `next()` will error.
*
* @return {this}
* @api private
* @method _destroy
*/
AggregationCursor.prototype._destroy = function _destroy(_err, callback) {
let waitForCursor = null;
if (!this.cursor) {
waitForCursor = new Promise((resolve) => {
this.once('cursor', resolve);
});
} else {
waitForCursor = Promise.resolve();
}
waitForCursor
.then(() => this.cursor.close())
.then(() => {
this._closed = true;
callback();
})
.catch(error => {
callback(error);
});
return this;
};
/**
* Get the next document from this cursor. Will return `null` when there are
* no documents left.
*
* @return {Promise}
* @api public
* @method next
*/
AggregationCursor.prototype.next = async function next() {
if (typeof arguments[0] === 'function') {
throw new MongooseError('AggregationCursor.prototype.next() no longer accepts a callback');
}
return new Promise((resolve, reject) => {
_next(this, (err, res) => {
if (err != null) {
return reject(err);
}
resolve(res);
});
});
};
/**
* Execute `fn` for every document in the cursor. If `fn` returns a promise,
* will wait for the promise to resolve before iterating on to the next one.
* Returns a promise that resolves when done.
*
* @param {Function} fn
* @param {Object} [options]
* @param {Number} [options.parallel] the number of promises to execute in parallel. Defaults to 1.
* @param {Number} [options.batchSize=null] if set, Mongoose will call `fn` with an array of at most `batchSize` documents, instead of a single document
* @param {Boolean} [options.continueOnError=false] if true, `eachAsync()` iterates through all docs even if `fn` throws an error. If false, `eachAsync()` throws an error immediately if the given function `fn()` throws an error.
* @return {Promise}
* @api public
* @method eachAsync
*/
AggregationCursor.prototype.eachAsync = function(fn, opts) {
if (typeof arguments[2] === 'function') {
throw new MongooseError('AggregationCursor.prototype.eachAsync() no longer accepts a callback');
}
const _this = this;
if (typeof opts === 'function') {
opts = {};
}
opts = opts || {};
return eachAsync(function(cb) { return _next(_this, cb); }, fn, opts);
};
/**
* Returns an asyncIterator for use with [`for/await/of` loops](https://thecodebarbarian.com/getting-started-with-async-iterators-in-node-js)
* You do not need to call this function explicitly, the JavaScript runtime
* will call it for you.
*
* #### Example:
*
* // Async iterator without explicitly calling `cursor()`. Mongoose still
* // creates an AggregationCursor instance internally.
* const agg = Model.aggregate([{ $match: { age: { $gte: 25 } } }]);
* for await (const doc of agg) {
* console.log(doc.name);
* }
*
* // You can also use an AggregationCursor instance for async iteration
* const cursor = Model.aggregate([{ $match: { age: { $gte: 25 } } }]).cursor();
* for await (const doc of cursor) {
* console.log(doc.name);
* }
*
* Node.js 10.x supports async iterators natively without any flags. You can
* enable async iterators in Node.js 8.x using the [`--harmony_async_iteration` flag](https://github.com/tc39/proposal-async-iteration/issues/117#issuecomment-346695187).
*
* **Note:** This function is not set if `Symbol.asyncIterator` is undefined. If
* `Symbol.asyncIterator` is undefined, that means your Node.js version does not
* support async iterators.
*
* @method [Symbol.asyncIterator]
* @memberOf AggregationCursor
* @instance
* @api public
*/
if (Symbol.asyncIterator != null) {
AggregationCursor.prototype[Symbol.asyncIterator] = function() {
return this.transformNull()._transformForAsyncIterator();
};
}
/*!
* ignore
*/
AggregationCursor.prototype._transformForAsyncIterator = function() {
if (this._transforms.indexOf(_transformForAsyncIterator) === -1) {
this.map(_transformForAsyncIterator);
}
return this;
};
/*!
* ignore
*/
AggregationCursor.prototype.transformNull = function(val) {
if (arguments.length === 0) {
val = true;
}
this._mongooseOptions.transformNull = val;
return this;
};
/*!
* ignore
*/
function _transformForAsyncIterator(doc) {
return doc == null ? { done: true } : { value: doc, done: false };
}
/**
* Adds a [cursor flag](https://mongodb.github.io/node-mongodb-native/4.9/classes/AggregationCursor.html#addCursorFlag).
* Useful for setting the `noCursorTimeout` and `tailable` flags.
*
* @param {String} flag
* @param {Boolean} value
* @return {AggregationCursor} this
* @api public
* @method addCursorFlag
*/
AggregationCursor.prototype.addCursorFlag = function(flag, value) {
const _this = this;
_waitForCursor(this, function() {
_this.cursor.addCursorFlag(flag, value);
});
return this;
};
/*!
* ignore
*/
function _waitForCursor(ctx, cb) {
if (ctx.cursor) {
return cb();
}
ctx.once('cursor', function() {
cb();
});
}
/**
* Get the next doc from the underlying cursor and mongooseify it
* (populate, etc.)
* @param {Any} ctx
* @param {Function} cb
* @api private
*/
function _next(ctx, cb) {
let callback = cb;
if (ctx._transforms.length) {
callback = function(err, doc) {
if (err || (doc === null && !ctx._mongooseOptions.transformNull)) {
return cb(err, doc);
}
cb(err, ctx._transforms.reduce(function(doc, fn) {
return fn(doc);
}, doc));
};
}
if (ctx._error) {
return immediate(function() {
callback(ctx._error);
});
}
if (ctx.cursor) {
return ctx.cursor.next().then(
doc => {
if (!doc) {
return callback(null, null);
}
callback(null, doc);
},
err => callback(err)
);
} else {
ctx.once('error', cb);
ctx.once('cursor', function() {
_next(ctx, cb);
});
}
}
module.exports = AggregationCursor;

'use strict';
/*!
* Module dependencies.
*/
const EventEmitter = require('events').EventEmitter;
const MongooseError = require('../error/mongooseError');
/*!
* ignore
*/
const driverChangeStreamEvents = ['close', 'change', 'end', 'error', 'resumeTokenChanged'];
/*!
* ignore
*/
class ChangeStream extends EventEmitter {
constructor(changeStreamThunk, pipeline, options) {
super();
this.driverChangeStream = null;
this.closed = false;
this.bindedEvents = false;
this.pipeline = pipeline;
this.options = options;
this.errored = false;
if (options && options.hydrate && !options.model) {
throw new Error(
'Cannot create change stream with `hydrate: true` ' +
'unless calling `Model.watch()`'
);
}
let syncError = null;
this.$driverChangeStreamPromise = new Promise((resolve, reject) => {
// This wrapper is necessary because of buffering.
try {
changeStreamThunk((err, driverChangeStream) => {
if (err != null) {
this.errored = true;
this.emit('error', err);
return reject(err);
}
this.driverChangeStream = driverChangeStream;
this.emit('ready');
resolve();
});
} catch (err) {
syncError = err;
this.errored = true;
this.emit('error', err);
reject(err);
}
});
// Because a ChangeStream is an event emitter, there's no way to register an 'error' handler
// that catches errors which occur in the constructor, unless we force sync errors into async
    // errors with setImmediate(). For a cleaner stack trace, we just immediately throw any synchronous
// errors that occurred with changeStreamThunk().
if (syncError != null) {
throw syncError;
}
}
_bindEvents() {
if (this.bindedEvents) {
return;
}
this.bindedEvents = true;
if (this.driverChangeStream == null) {
this.$driverChangeStreamPromise.then(
() => {
this.driverChangeStream.on('close', () => {
this.closed = true;
});
driverChangeStreamEvents.forEach(ev => {
this.driverChangeStream.on(ev, data => {
if (data != null && data.fullDocument != null && this.options && this.options.hydrate) {
data.fullDocument = this.options.model.hydrate(data.fullDocument);
}
this.emit(ev, data);
});
});
},
() => {} // No need to register events if opening change stream failed
);
return;
}
this.driverChangeStream.on('close', () => {
this.closed = true;
});
driverChangeStreamEvents.forEach(ev => {
this.driverChangeStream.on(ev, data => {
if (data != null && data.fullDocument != null && this.options && this.options.hydrate) {
data.fullDocument = this.options.model.hydrate(data.fullDocument);
}
this.emit(ev, data);
});
});
}
hasNext(cb) {
if (this.errored) {
throw new MongooseError('Cannot call hasNext() on errored ChangeStream');
}
return this.driverChangeStream.hasNext(cb);
}
next(cb) {
if (this.errored) {
throw new MongooseError('Cannot call next() on errored ChangeStream');
}
if (this.options && this.options.hydrate) {
if (cb != null) {
const originalCb = cb;
cb = (err, data) => {
if (err != null) {
return originalCb(err);
}
if (data.fullDocument != null) {
data.fullDocument = this.options.model.hydrate(data.fullDocument);
}
return originalCb(null, data);
};
}
let maybePromise = this.driverChangeStream.next(cb);
if (maybePromise && typeof maybePromise.then === 'function') {
maybePromise = maybePromise.then(data => {
if (data.fullDocument != null) {
data.fullDocument = this.options.model.hydrate(data.fullDocument);
}
return data;
});
}
return maybePromise;
}
return this.driverChangeStream.next(cb);
}
addListener(event, handler) {
if (this.errored) {
throw new MongooseError('Cannot call addListener() on errored ChangeStream');
}
this._bindEvents();
return super.addListener(event, handler);
}
on(event, handler) {
if (this.errored) {
throw new MongooseError('Cannot call on() on errored ChangeStream');
}
this._bindEvents();
return super.on(event, handler);
}
once(event, handler) {
if (this.errored) {
throw new MongooseError('Cannot call once() on errored ChangeStream');
}
this._bindEvents();
return super.once(event, handler);
}
_queue(cb) {
this.once('ready', () => cb());
}
close() {
this.closed = true;
if (this.driverChangeStream) {
return this.driverChangeStream.close();
} else {
return this.$driverChangeStreamPromise.then(
() => this.driverChangeStream.close(),
() => {} // No need to close if opening the change stream failed
);
}
}
}
/*!
* ignore
*/
module.exports = ChangeStream;
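
When `hydrate` is enabled, `ChangeStream#next()` above replaces the user's callback with one that first converts `fullDocument` into a model instance. A self-contained sketch of that callback-wrapping pattern (`hydrate` below is an illustrative stand-in for `Model.hydrate()`, not the real implementation):

```javascript
// Sketch of the callback-wrapping used by ChangeStream#next() with
// `hydrate: true`: errors pass through untouched, and fullDocument is
// hydrated before the original callback sees the change event.
function wrapForHydrate(cb, hydrate) {
  return (err, data) => {
    if (err != null) {
      return cb(err);
    }
    if (data.fullDocument != null) {
      data.fullDocument = hydrate(data.fullDocument);
    }
    return cb(null, data);
  };
}

const hydrate = raw => ({ ...raw, hydrated: true }); // illustrative stand-in
let seen = null;
const wrapped = wrapForHydrate((err, data) => { seen = data; }, hydrate);
wrapped(null, { operationType: 'insert', fullDocument: { _id: 1 } });
console.log(seen.fullDocument); // { _id: 1, hydrated: true }
```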

/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('../error/mongooseError');
const Readable = require('stream').Readable;
const eachAsync = require('../helpers/cursor/eachAsync');
const helpers = require('../queryHelpers');
const kareem = require('kareem');
const immediate = require('../helpers/immediate');
const { once } = require('events');
const util = require('util');
/**
* A QueryCursor is a concurrency primitive for processing query results
* one document at a time. A QueryCursor fulfills the Node.js streams3 API,
* in addition to several other mechanisms for loading documents from MongoDB
* one at a time.
*
* QueryCursors execute the model's pre `find` hooks before loading any documents
* from MongoDB, and the model's post `find` hooks after loading each document.
*
* Unless you're an advanced user, do **not** instantiate this class directly.
* Use [`Query#cursor()`](https://mongoosejs.com/docs/api/query.html#Query.prototype.cursor()) instead.
*
* @param {Query} query
* @param {Object} options query options passed to `.find()`
* @inherits Readable https://nodejs.org/api/stream.html#class-streamreadable
* @event `cursor`: Emitted when the cursor is created
* @event `error`: Emitted when an error occurred
* @event `data`: Emitted when the stream is flowing and the next doc is ready
* @event `end`: Emitted when the stream is exhausted
* @api public
*/
function QueryCursor(query) {
  // Set `autoDestroy: true` because it defaults to false on Node 12.
  // gh-10902: `autoDestroy` is needed to destroy correctly and emit the 'close' event.
Readable.call(this, { autoDestroy: true, objectMode: true });
this.cursor = null;
this.skipped = false;
this.query = query;
this._closed = false;
const model = query.model;
this._mongooseOptions = {};
this._transforms = [];
this.model = model;
this.options = {};
model.hooks.execPre('find', query, (err) => {
if (err != null) {
if (err instanceof kareem.skipWrappedFunction) {
const resultValue = err.args[0];
if (resultValue != null && (!Array.isArray(resultValue) || resultValue.length)) {
const err = new MongooseError(
'Cannot `skipMiddlewareFunction()` with a value when using ' +
'`.find().cursor()`, value must be nullish or empty array, got "' +
util.inspect(resultValue) +
'".'
);
this._markError(err);
this.listeners('error').length > 0 && this.emit('error', err);
return;
}
this.skipped = true;
this.emit('cursor', null);
return;
}
this._markError(err);
this.listeners('error').length > 0 && this.emit('error', err);
return;
}
Object.assign(this.options, query._optionsForExec());
this._transforms = this._transforms.concat(query._transforms.slice());
if (this.options.transform) {
this._transforms.push(this.options.transform);
}
// Re: gh-8039, you need to set the `cursor.batchSize` option, top-level
// `batchSize` option doesn't work.
if (this.options.batchSize) {
// Max out the number of documents we'll populate in parallel at 5000.
this.options._populateBatchSize = Math.min(this.options.batchSize, 5000);
}
if (query._mongooseOptions._asyncIterator) {
this._mongooseOptions._asyncIterator = true;
}
if (model.collection._shouldBufferCommands() && model.collection.buffer) {
model.collection.queue.push([
() => _getRawCursor(query, this)
]);
} else {
_getRawCursor(query, this);
}
});
}
util.inherits(QueryCursor, Readable);
/*!
* ignore
*/
function _getRawCursor(query, queryCursor) {
try {
const cursor = query.model.collection.find(query._conditions, queryCursor.options);
queryCursor.cursor = cursor;
queryCursor.emit('cursor', cursor);
} catch (err) {
queryCursor._markError(err);
queryCursor.listeners('error').length > 0 && queryCursor.emit('error', queryCursor._error);
}
}
/**
* Necessary to satisfy the Readable API
* @method _read
* @memberOf QueryCursor
* @instance
* @api private
*/
QueryCursor.prototype._read = function() {
_next(this, (error, doc) => {
if (error) {
return this.emit('error', error);
}
if (!doc) {
this.push(null);
      this.cursor.close((error) => {
        if (error) {
          return this.emit('error', error);
        }
      });
return;
}
this.push(doc);
});
};
/**
* Returns the underlying cursor from the MongoDB Node driver that this cursor uses.
*
* @method getDriverCursor
* @memberOf QueryCursor
* @returns {Cursor} MongoDB Node driver cursor instance
* @instance
* @api public
*/
QueryCursor.prototype.getDriverCursor = async function getDriverCursor() {
if (this.cursor) {
return this.cursor;
}
await once(this, 'cursor');
return this.cursor;
};
/**
* Registers a transform function which subsequently maps documents retrieved
* via the streams interface or `.next()`
*
* #### Example:
*
* // Map documents returned by `data` events
* Thing.
* find({ name: /^hello/ }).
* cursor().
* map(function (doc) {
* doc.foo = "bar";
* return doc;
* })
* on('data', function(doc) { console.log(doc.foo); });
*
* // Or map documents returned by `.next()`
* const cursor = Thing.find({ name: /^hello/ }).
* cursor().
* map(function (doc) {
* doc.foo = "bar";
* return doc;
* });
* cursor.next(function(error, doc) {
* console.log(doc.foo);
* });
*
* @param {Function} fn
* @return {QueryCursor}
* @memberOf QueryCursor
* @api public
* @method map
*/
Object.defineProperty(QueryCursor.prototype, 'map', {
value: function(fn) {
this._transforms.push(fn);
return this;
},
enumerable: true,
configurable: true,
writable: true
});
/**
* Marks this cursor as errored
* @method _markError
* @memberOf QueryCursor
* @instance
* @api private
*/
QueryCursor.prototype._markError = function(error) {
this._error = error;
return this;
};
/**
* Marks this cursor as closed. Will stop streaming and subsequent calls to
* `next()` will error.
*
* @return {Promise}
* @api public
* @method close
* @emits close
* @see AggregationCursor.close https://mongodb.github.io/node-mongodb-native/4.9/classes/AggregationCursor.html#close
*/
QueryCursor.prototype.close = async function close() {
if (typeof arguments[0] === 'function') {
throw new MongooseError('QueryCursor.prototype.close() no longer accepts a callback');
}
try {
await this.cursor.close();
this._closed = true;
this.emit('close');
} catch (error) {
this.listeners('error').length > 0 && this.emit('error', error);
throw error;
}
};
/**
* Marks this cursor as destroyed. Will stop streaming and subsequent calls to
* `next()` will error.
*
* @return {this}
* @api private
* @method _destroy
*/
QueryCursor.prototype._destroy = function _destroy(_err, callback) {
let waitForCursor = null;
if (!this.cursor) {
waitForCursor = new Promise((resolve) => {
this.once('cursor', resolve);
});
} else {
waitForCursor = Promise.resolve();
}
waitForCursor
.then(() => {
this.cursor.close();
})
.then(() => {
this._closed = true;
callback();
})
.catch(error => {
callback(error);
});
return this;
};
/**
* Rewind this cursor to its uninitialized state. Any options that are present on the cursor will
* remain in effect. Iterating this cursor will cause new queries to be sent to the server, even
* if the resultant data has already been retrieved by this cursor.
*
 * @return {QueryCursor} this
* @api public
* @method rewind
*/
QueryCursor.prototype.rewind = function() {
_waitForCursor(this, () => {
this.cursor.rewind();
});
return this;
};
/**
* Get the next document from this cursor. Will return `null` when there are
* no documents left.
*
* @return {Promise}
* @api public
* @method next
*/
QueryCursor.prototype.next = async function next() {
if (typeof arguments[0] === 'function') {
throw new MongooseError('QueryCursor.prototype.next() no longer accepts a callback');
}
if (this._closed) {
throw new MongooseError('Cannot call `next()` on a closed cursor');
}
return new Promise((resolve, reject) => {
_next(this, function(error, doc) {
if (error) {
return reject(error);
}
resolve(doc);
});
});
};
/**
* Execute `fn` for every document in the cursor. If `fn` returns a promise,
* will wait for the promise to resolve before iterating on to the next one.
* Returns a promise that resolves when done.
*
* #### Example:
*
* // Iterate over documents asynchronously
* Thing.
* find({ name: /^hello/ }).
* cursor().
* eachAsync(async function (doc, i) {
* doc.foo = doc.bar + i;
* await doc.save();
* })
*
* @param {Function} fn
* @param {Object} [options]
* @param {Number} [options.parallel] the number of promises to execute in parallel. Defaults to 1.
* @param {Number} [options.batchSize] if set, will call `fn()` with arrays of documents with length at most `batchSize`
* @param {Boolean} [options.continueOnError=false] if true, `eachAsync()` iterates through all docs even if `fn` throws an error. If false, `eachAsync()` throws an error immediately if the given function `fn()` throws an error.
* @return {Promise}
* @api public
* @method eachAsync
*/
QueryCursor.prototype.eachAsync = function(fn, opts) {
if (typeof arguments[2] === 'function') {
throw new MongooseError('QueryCursor.prototype.eachAsync() no longer accepts a callback');
}
if (typeof opts === 'function') {
opts = {};
}
opts = opts || {};
return eachAsync((cb) => _next(this, cb), fn, opts);
};
/**
* The `options` passed in to the `QueryCursor` constructor.
*
* @api public
* @property options
*/
QueryCursor.prototype.options;
/**
* Adds a [cursor flag](https://mongodb.github.io/node-mongodb-native/4.9/classes/FindCursor.html#addCursorFlag).
* Useful for setting the `noCursorTimeout` and `tailable` flags.
*
* @param {String} flag
* @param {Boolean} value
 * @return {QueryCursor} this
* @api public
* @method addCursorFlag
*/
QueryCursor.prototype.addCursorFlag = function(flag, value) {
_waitForCursor(this, () => {
this.cursor.addCursorFlag(flag, value);
});
return this;
};
/**
* Returns an asyncIterator for use with [`for/await/of` loops](https://thecodebarbarian.com/getting-started-with-async-iterators-in-node-js).
* You do not need to call this function explicitly, the JavaScript runtime
* will call it for you.
*
* #### Example:
*
* // Works without using `cursor()`
 * for await (const doc of Model.find().sort({ name: 1 })) {
* console.log(doc.name);
* }
*
* // Can also use `cursor()`
 * for await (const doc of Model.find().sort({ name: 1 }).cursor()) {
* console.log(doc.name);
* }
*
* Node.js 10.x supports async iterators natively without any flags. You can
* enable async iterators in Node.js 8.x using the [`--harmony_async_iteration` flag](https://github.com/tc39/proposal-async-iteration/issues/117#issuecomment-346695187).
*
 * **Note:** This function is not set if `Symbol.asyncIterator` is undefined. If
* `Symbol.asyncIterator` is undefined, that means your Node.js version does not
* support async iterators.
*
* @method [Symbol.asyncIterator]
* @memberOf QueryCursor
* @instance
* @api public
*/
if (Symbol.asyncIterator != null) {
QueryCursor.prototype[Symbol.asyncIterator] = function queryCursorAsyncIterator() {
// Set so QueryCursor knows it should transform results for async iterators into `{ value, done }` syntax
this._mongooseOptions._asyncIterator = true;
return this;
};
}
/**
* Get the next doc from the underlying cursor and mongooseify it
* (populate, etc.)
* @param {Any} ctx
* @param {Function} cb
* @api private
*/
function _next(ctx, cb) {
let callback = cb;
// Create a custom callback to handle transforms, async iterator, and transformNull
callback = function(err, doc) {
if (err) {
return cb(err);
}
// Handle null documents - if asyncIterator, we need to return `done: true`, otherwise just
// skip. In either case, avoid transforms.
if (doc === null) {
if (ctx._mongooseOptions._asyncIterator) {
return cb(null, { done: true });
} else {
return cb(null, null);
}
}
// Apply transforms
if (ctx._transforms.length && doc !== null) {
doc = ctx._transforms.reduce(function(doc, fn) {
return fn.call(ctx, doc);
}, doc);
}
// This option is set in `Symbol.asyncIterator` code paths.
// For async iterator, we need to convert to {value, done} format
if (ctx._mongooseOptions._asyncIterator) {
return cb(null, { value: doc, done: false });
}
return cb(null, doc);
};
if (ctx._error) {
return immediate(function() {
callback(ctx._error);
});
}
if (ctx.skipped) {
return immediate(() => callback(null, null));
}
if (ctx.cursor) {
if (ctx.query._mongooseOptions.populate && !ctx._pop) {
ctx._pop = helpers.preparePopulationOptionsMQ(ctx.query,
ctx.query._mongooseOptions);
ctx._pop.__noPromise = true;
}
if (ctx.query._mongooseOptions.populate && ctx.options._populateBatchSize > 1) {
if (ctx._batchDocs && ctx._batchDocs.length) {
// Return a cached populated doc
return _nextDoc(ctx, ctx._batchDocs.shift(), ctx._pop, callback);
} else if (ctx._batchExhausted) {
// Internal cursor reported no more docs. Act the same here
return callback(null, null);
} else {
// Request as many docs as batchSize, to populate them also in batch
ctx._batchDocs = [];
ctx.cursor.next().then(
res => { _onNext.call({ ctx, callback }, null, res); },
err => { _onNext.call({ ctx, callback }, err); }
);
return;
}
} else {
return ctx.cursor.next().then(
doc => {
if (!doc) {
callback(null, null);
return;
}
if (!ctx.query._mongooseOptions.populate) {
return _nextDoc(ctx, doc, null, callback);
}
ctx.query.model.populate(doc, ctx._pop).then(
doc => {
_nextDoc(ctx, doc, ctx._pop, callback);
},
err => {
callback(err);
}
);
},
error => {
callback(error);
}
);
}
} else {
ctx.once('error', cb);
ctx.once('cursor', function(cursor) {
ctx.removeListener('error', cb);
if (cursor == null) {
if (ctx.skipped) {
return cb(null, null);
}
return;
}
_next(ctx, cb);
});
}
}
/*!
* ignore
*/
function _onNext(error, doc) {
if (error) {
return this.callback(error);
}
if (!doc) {
this.ctx._batchExhausted = true;
return _populateBatch.call(this);
}
this.ctx._batchDocs.push(doc);
if (this.ctx._batchDocs.length < this.ctx.options._populateBatchSize) {
// If both `batchSize` and `_populateBatchSize` are huge, calling `next()` repeatedly may
// cause a stack overflow. So make sure we clear the stack.
immediate(() => this.ctx.cursor.next().then(
res => { _onNext.call(this, null, res); },
err => { _onNext.call(this, err); }
));
} else {
_populateBatch.call(this);
}
}
/*!
* ignore
*/
function _populateBatch() {
if (!this.ctx._batchDocs.length) {
return this.callback(null, null);
}
this.ctx.query.model.populate(this.ctx._batchDocs, this.ctx._pop).then(
() => {
_nextDoc(this.ctx, this.ctx._batchDocs.shift(), this.ctx._pop, this.callback);
},
err => {
this.callback(err);
}
);
}
/*!
* ignore
*/
function _nextDoc(ctx, doc, pop, callback) {
if (ctx.query._mongooseOptions.lean) {
return ctx.model.hooks.execPost('find', ctx.query, [[doc]], err => {
if (err != null) {
return callback(err);
}
callback(null, doc);
});
}
const { model, _fields, _userProvidedFields, options } = ctx.query;
helpers.createModelAndInit(model, doc, _fields, _userProvidedFields, options, pop, (err, doc) => {
if (err != null) {
return callback(err);
}
ctx.model.hooks.execPost('find', ctx.query, [[doc]], err => {
if (err != null) {
return callback(err);
}
callback(null, doc);
});
});
}
/*!
* ignore
*/
function _waitForCursor(ctx, cb) {
if (ctx.cursor) {
return cb();
}
ctx.once('cursor', function(cursor) {
if (cursor == null) {
return;
}
cb();
});
}
module.exports = QueryCursor;

'use strict';
/* eslint-env browser */
/*!
* Module dependencies.
*/
const Document = require('./document.js');
const BrowserDocument = require('./browserDocument.js');
let isBrowser = false;
/**
* Returns the Document constructor for the current context
*
* @api private
*/
module.exports = function documentProvider() {
if (isBrowser) {
return BrowserDocument;
}
return Document;
};
/*!
* ignore
*/
module.exports.setBrowser = function(flag) {
isBrowser = flag;
};

'use strict';
/*!
* ignore
*/
let driver = null;
module.exports.get = function() {
return driver;
};
module.exports.set = function(v) {
driver = v;
};

# Driver Spec
TODO

/*!
* Module dependencies.
*/
'use strict';
const Binary = require('bson').Binary;
/*!
* Module exports.
*/
module.exports = exports = Binary;

/*!
* ignore
*/
'use strict';
module.exports = require('bson').Decimal128;

/*!
* Module exports.
*/
'use strict';
exports.Collection = function() {
throw new Error('Cannot create a collection from browser library');
};
exports.Connection = function() {
throw new Error('Cannot create a connection from browser library');
};
exports.BulkWriteResult = function() {};

/*!
* [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) ObjectId
* @constructor NodeMongoDbObjectId
* @see ObjectId
*/
'use strict';
const ObjectId = require('bson').ObjectID;
/**
* Getter for convenience with populate, see gh-6115
* @api private
*/
Object.defineProperty(ObjectId.prototype, '_id', {
enumerable: false,
configurable: true,
get: function() {
return this;
}
});
/*!
* ignore
*/
module.exports = exports = ObjectId;

'use strict';
const BulkWriteResult = require('mongodb/lib/bulk/common').BulkWriteResult;
module.exports = BulkWriteResult;

'use strict';
/*!
* Module dependencies.
*/
const MongooseCollection = require('../../collection');
const MongooseError = require('../../error/mongooseError');
const Collection = require('mongodb').Collection;
const ObjectId = require('../../types/objectid');
const getConstructorName = require('../../helpers/getConstructorName');
const internalToObjectOptions = require('../../options').internalToObjectOptions;
const stream = require('stream');
const util = require('util');
const formatToObjectOptions = Object.freeze({ ...internalToObjectOptions, copyTrustedSymbol: false });
/**
* A [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) collection implementation.
*
 * All methods from the [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) driver are copied and wrapped in queue management.
*
* @inherits Collection https://mongodb.github.io/node-mongodb-native/4.9/classes/Collection.html
* @api private
*/
function NativeCollection(name, conn, options) {
this.collection = null;
this.Promise = options.Promise || Promise;
this.modelName = options.modelName;
delete options.modelName;
this._closed = false;
MongooseCollection.apply(this, arguments);
}
/*!
* Inherit from abstract Collection.
*/
Object.setPrototypeOf(NativeCollection.prototype, MongooseCollection.prototype);
/**
* Called when the connection opens.
*
* @api private
*/
NativeCollection.prototype.onOpen = function() {
this.collection = this.conn.db.collection(this.name);
MongooseCollection.prototype.onOpen.call(this);
return this.collection;
};
/**
* Called when the connection closes
*
* @api private
*/
NativeCollection.prototype.onClose = function(force) {
MongooseCollection.prototype.onClose.call(this, force);
};
/**
* Helper to get the collection, in case `this.collection` isn't set yet.
 * May happen if `bufferCommands` is false and the model was created while
 * Mongoose was disconnected.
*
* @api private
*/
NativeCollection.prototype._getCollection = function _getCollection() {
if (this.collection) {
return this.collection;
}
if (this.conn.db != null) {
this.collection = this.conn.db.collection(this.name);
return this.collection;
}
return null;
};
/*!
* ignore
*/
const syncCollectionMethods = { watch: true, find: true, aggregate: true };
/**
* Copy the collection methods and make them subject to queues
 * @param {Number|String} i
* @api private
*/
function iter(i) {
NativeCollection.prototype[i] = function() {
const collection = this._getCollection();
const args = Array.from(arguments);
const _this = this;
const globalDebug = _this &&
_this.conn &&
_this.conn.base &&
_this.conn.base.options &&
_this.conn.base.options.debug;
const connectionDebug = _this &&
_this.conn &&
_this.conn.options &&
_this.conn.options.debug;
const debug = connectionDebug == null ? globalDebug : connectionDebug;
const lastArg = arguments[arguments.length - 1];
const opId = new ObjectId();
// If user force closed, queueing will hang forever. See #5664
if (this.conn.$wasForceClosed) {
const error = new MongooseError('Connection was force closed');
if (args.length > 0 &&
typeof args[args.length - 1] === 'function') {
args[args.length - 1](error);
return;
} else {
throw error;
}
}
let _args = args;
let callback = null;
if (this._shouldBufferCommands() && this.buffer) {
this.conn.emit('buffer', {
_id: opId,
modelName: _this.modelName,
collectionName: _this.name,
method: i,
args: args
});
let callback;
let _args = args;
let promise = null;
let timeout = null;
if (syncCollectionMethods[i] && typeof lastArg === 'function') {
this.addQueue(i, _args);
callback = lastArg;
} else if (syncCollectionMethods[i]) {
promise = new this.Promise((resolve, reject) => {
callback = function collectionOperationCallback(err, res) {
if (timeout != null) {
clearTimeout(timeout);
}
if (err != null) {
return reject(err);
}
resolve(res);
};
_args = args.concat([callback]);
this.addQueue(i, _args);
});
} else if (typeof lastArg === 'function') {
callback = function collectionOperationCallback() {
if (timeout != null) {
clearTimeout(timeout);
}
return lastArg.apply(this, arguments);
};
_args = args.slice(0, args.length - 1).concat([callback]);
} else {
promise = new Promise((resolve, reject) => {
callback = function collectionOperationCallback(err, res) {
if (timeout != null) {
clearTimeout(timeout);
}
if (err != null) {
return reject(err);
}
resolve(res);
};
_args = args.concat([callback]);
this.addQueue(i, _args);
});
}
const bufferTimeoutMS = this._getBufferTimeoutMS();
timeout = setTimeout(() => {
const removed = this.removeQueue(i, _args);
if (removed) {
const message = 'Operation `' + this.name + '.' + i + '()` buffering timed out after ' +
bufferTimeoutMS + 'ms';
const err = new MongooseError(message);
this.conn.emit('buffer-end', { _id: opId, modelName: _this.modelName, collectionName: _this.name, method: i, error: err });
callback(err);
}
}, bufferTimeoutMS);
if (!syncCollectionMethods[i] && typeof lastArg === 'function') {
this.addQueue(i, _args);
return;
}
return promise;
} else if (!syncCollectionMethods[i] && typeof lastArg === 'function') {
callback = function collectionOperationCallback(err, res) {
if (err != null) {
_this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: _this.name, method: i, error: err });
} else {
_this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: _this.name, method: i, result: res });
}
return lastArg.apply(this, arguments);
};
_args = args.slice(0, args.length - 1).concat([callback]);
}
if (debug) {
if (typeof debug === 'function') {
let argsToAdd = null;
if (typeof args[args.length - 1] == 'function') {
argsToAdd = args.slice(0, args.length - 1);
} else {
argsToAdd = args;
}
debug.apply(_this,
[_this.name, i].concat(argsToAdd));
} else if (debug instanceof stream.Writable) {
this.$printToStream(_this.name, i, args, debug);
} else {
const color = debug.color == null ? true : debug.color;
const shell = debug.shell == null ? false : debug.shell;
this.$print(_this.name, i, args, color, shell);
}
}
this.conn.emit('operation-start', { _id: opId, modelName: _this.modelName, collectionName: this.name, method: i, params: _args });
try {
if (collection == null) {
const message = 'Cannot call `' + this.name + '.' + i + '()` before initial connection ' +
'is complete if `bufferCommands = false`. Make sure you `await mongoose.connect()` if ' +
'you have `bufferCommands = false`.';
throw new MongooseError(message);
}
if (syncCollectionMethods[i] && typeof lastArg === 'function') {
const result = collection[i].apply(collection, _args.slice(0, _args.length - 1));
this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: this.name, method: i, result });
return lastArg.call(this, null, result);
}
const ret = collection[i].apply(collection, _args);
if (ret != null && typeof ret.then === 'function') {
return ret.then(
result => {
if (typeof lastArg === 'function') {
lastArg(null, result);
} else {
this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: this.name, method: i, result });
}
return result;
},
error => {
if (typeof lastArg === 'function') {
lastArg(error);
return;
} else {
this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: this.name, method: i, error });
}
throw error;
}
);
}
return ret;
} catch (error) {
// Collection operation may throw because of max bson size, catch it here
// See gh-3906
if (typeof lastArg === 'function') {
return lastArg(error);
} else {
this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: this.name, method: i, error: error });
throw error;
}
}
};
}
for (const key of Object.getOwnPropertyNames(Collection.prototype)) {
// Janky hack to work around gh-3005 until we can get rid of the mongoose
// collection abstraction
const descriptor = Object.getOwnPropertyDescriptor(Collection.prototype, key);
// Skip properties with getters because they may throw errors (gh-8528)
if (descriptor.get !== undefined) {
continue;
}
if (typeof Collection.prototype[key] !== 'function') {
continue;
}
iter(key);
}
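
The loop above copies every driver `Collection` method onto the Mongoose collection while skipping accessor properties, because merely reading a getter can throw (gh-8528). A self-contained sketch of that copy-with-getter-skip technique (the `Driver` class is illustrative, not a Mongoose internal):

```javascript
// Sketch: copy methods from a prototype onto a wrapper object, skipping
// getters whose evaluation might throw, exactly as the iter() loop does.
class Driver {
  ping() { return 'pong'; }
  get dangerous() { throw new Error('boom'); }
}

const wrapped = {};
for (const key of Object.getOwnPropertyNames(Driver.prototype)) {
  const descriptor = Object.getOwnPropertyDescriptor(Driver.prototype, key);
  if (descriptor.get !== undefined) {
    continue; // skip getters: touching them may throw (cf. gh-8528)
  }
  if (typeof Driver.prototype[key] !== 'function') {
    continue;
  }
  wrapped[key] = function() {
    // queueing/debug logic would go here in the real implementation
    return Driver.prototype[key].apply(new Driver(), arguments);
  };
}

console.log(wrapped.ping()); // 'pong'
console.log('dangerous' in wrapped); // false
```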
/**
* Debug print helper
*
* @api public
* @method $print
*/
NativeCollection.prototype.$print = function(name, i, args, color, shell) {
const moduleName = color ? '\x1B[0;36mMongoose:\x1B[0m ' : 'Mongoose: ';
const functionCall = [name, i].join('.');
const _args = [];
for (let j = args.length - 1; j >= 0; --j) {
if (this.$format(args[j]) || _args.length) {
_args.unshift(this.$format(args[j], color, shell));
}
}
const params = '(' + _args.join(', ') + ')';
console.info(moduleName + functionCall + params);
};
/**
* Debug print helper
*
* @api public
 * @method $printToStream
*/
NativeCollection.prototype.$printToStream = function(name, i, args, stream) {
const functionCall = [name, i].join('.');
const _args = [];
for (let j = args.length - 1; j >= 0; --j) {
if (this.$format(args[j]) || _args.length) {
_args.unshift(this.$format(args[j]));
}
}
const params = '(' + _args.join(', ') + ')';
stream.write(functionCall + params, 'utf8');
};
/**
* Formatter for debug print args
*
* @api public
* @method $format
*/
NativeCollection.prototype.$format = function(arg, color, shell) {
const type = typeof arg;
if (type === 'function' || type === 'undefined') return '';
return format(arg, false, color, shell);
};
/**
* Debug print helper
* @param {Any} representation
* @api private
*/
function inspectable(representation) {
const ret = {
inspect: function() { return representation; }
};
if (util.inspect.custom) {
ret[util.inspect.custom] = ret.inspect;
}
return ret;
}
function map(o) {
return format(o, true);
}
function formatObjectId(x, key) {
x[key] = inspectable('ObjectId("' + x[key].toHexString() + '")');
}
function formatDate(x, key, shell) {
if (shell) {
x[key] = inspectable('ISODate("' + x[key].toUTCString() + '")');
} else {
x[key] = inspectable('new Date("' + x[key].toUTCString() + '")');
}
}
function format(obj, sub, color, shell) {
if (obj && typeof obj.toBSON === 'function') {
obj = obj.toBSON();
}
if (obj == null) {
return obj;
}
const clone = require('../../helpers/clone');
// `sub` indicates `format()` was called recursively, so skip cloning because we already
// did a deep clone on the top-level object.
let x = sub ? obj : clone(obj, formatToObjectOptions);
const constructorName = getConstructorName(x);
if (constructorName === 'Binary') {
x = 'BinData(' + x.sub_type + ', "' + x.toString('base64') + '")';
} else if (constructorName === 'ObjectId') {
x = inspectable('ObjectId("' + x.toHexString() + '")');
} else if (constructorName === 'Date') {
x = inspectable('new Date("' + x.toUTCString() + '")');
} else if (constructorName === 'Object') {
const keys = Object.keys(x);
const numKeys = keys.length;
let key;
for (let i = 0; i < numKeys; ++i) {
key = keys[i];
if (x[key]) {
let error;
if (typeof x[key].toBSON === 'function') {
try {
// `session.toBSON()` throws an error. This means we throw errors
// in debug mode when using transactions, see gh-6712. As a
// workaround, catch `toBSON()` errors, try to serialize without
// `toBSON()`, and rethrow if serialization still fails.
x[key] = x[key].toBSON();
} catch (_error) {
error = _error;
}
}
const _constructorName = getConstructorName(x[key]);
if (_constructorName === 'Binary') {
x[key] = 'BinData(' + x[key].sub_type + ', "' +
x[key].buffer.toString('base64') + '")';
} else if (_constructorName === 'Object') {
x[key] = format(x[key], true);
} else if (_constructorName === 'ObjectId') {
formatObjectId(x, key);
} else if (_constructorName === 'Date') {
formatDate(x, key, shell);
} else if (_constructorName === 'ClientSession') {
x[key] = inspectable('ClientSession("' +
(
x[key] &&
x[key].id &&
x[key].id.id &&
x[key].id.id.buffer || ''
).toString('hex') + '")');
} else if (Array.isArray(x[key])) {
x[key] = x[key].map(map);
} else if (error != null) {
// If there was an error with `toBSON()` and the object wasn't
// already converted to a string representation, rethrow it.
// Open to better ideas on how to handle this.
throw error;
}
}
}
}
if (sub) {
return x;
}
return util.
inspect(x, false, 10, color).
replace(/\n/g, '').
replace(/\s{2,}/g, ' ');
}
/**
* Retrieves information about this collections indexes.
*
* @method getIndexes
* @api public
*/
NativeCollection.prototype.getIndexes = NativeCollection.prototype.indexInformation;
/*!
* Module exports.
*/
module.exports = NativeCollection;


@ -0,0 +1,525 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseConnection = require('../../connection');
const MongooseError = require('../../error/index');
const STATES = require('../../connectionState');
const mongodb = require('mongodb');
const pkg = require('../../../package.json');
const processConnectionOptions = require('../../helpers/processConnectionOptions');
const setTimeout = require('../../helpers/timers').setTimeout;
const utils = require('../../utils');
const Schema = require('../../schema');
/**
* A [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) connection implementation.
*
* @inherits Connection
* @api private
*/
function NativeConnection() {
MongooseConnection.apply(this, arguments);
this._listening = false;
// Tracks the last time (as unix timestamp) the connection received a
// serverHeartbeatSucceeded or serverHeartbeatFailed event from the underlying MongoClient.
// If we haven't received one in a while (like due to a frozen AWS Lambda container) then
// `readyState` is likely stale.
this._lastHeartbeatAt = null;
}
/**
* Expose the possible connection states.
* @api public
*/
NativeConnection.STATES = STATES;
/*!
* Inherits from Connection.
*/
Object.setPrototypeOf(NativeConnection.prototype, MongooseConnection.prototype);
/**
* Switches to a different database using the same connection pool.
*
* Returns a new connection object, with the new db. If you set the `useCache`
* option, `useDb()` will cache connections by `name`.
*
* **Note:** Calling `close()` on a `useDb()` connection will close the base connection as well.
*
* @param {String} name The database name
* @param {Object} [options]
* @param {Boolean} [options.useCache=false] If true, cache results so calling `useDb()` multiple times with the same name only creates 1 connection object.
* @param {Boolean} [options.noListener=false] If true, the new connection object won't listen to any events on the base connection. This is better for memory usage in cases where you're calling `useDb()` for every request.
* @return {Connection} New Connection Object
* @api public
*/
NativeConnection.prototype.useDb = function(name, options) {
// Return immediately if cached
options = options || {};
if (options.useCache && this.relatedDbs[name]) {
return this.relatedDbs[name];
}
// we have to manually copy all of the attributes...
const newConn = new this.constructor();
newConn.name = name;
newConn.base = this.base;
newConn.collections = {};
newConn.models = {};
newConn.replica = this.replica;
newConn.config = Object.assign({}, this.config, newConn.config);
newConn.name = this.name;
newConn.options = this.options;
newConn._readyState = this._readyState;
newConn._closeCalled = this._closeCalled;
newConn._hasOpened = this._hasOpened;
newConn._listening = false;
newConn._parent = this;
newConn.host = this.host;
newConn.port = this.port;
newConn.user = this.user;
newConn.pass = this.pass;
// First, when we create another db object, we are not guaranteed to have a
// db object to work with. So, in the case where we have a db object and it
// is connected, we can just proceed with setting everything up. However, if
// we do not have a db or the state is not connected, then we need to wait on
// the 'open' event of the connection before doing the rest of the setup;
// the 'connected' event is the first time we'll have access to the db object.
const _this = this;
newConn.client = _this.client;
if (this.db && this._readyState === STATES.connected) {
wireup();
} else {
this._queue.push({ fn: wireup });
}
function wireup() {
newConn.client = _this.client;
const _opts = {};
if (options.hasOwnProperty('noListener')) {
_opts.noListener = options.noListener;
}
newConn.db = _this.client.db(name, _opts);
newConn._lastHeartbeatAt = _this._lastHeartbeatAt;
newConn.onOpen();
}
newConn.name = name;
// push onto the otherDbs stack, this is used when state changes
if (options.noListener !== true) {
this.otherDbs.push(newConn);
}
newConn.otherDbs.push(this);
// push onto the relatedDbs cache, this is used when state changes
if (options && options.useCache) {
this.relatedDbs[newConn.name] = newConn;
newConn.relatedDbs = this.relatedDbs;
}
return newConn;
};
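The `useCache` branch of `useDb()` above is a simple memoization by database name. A reduced sketch of that caching contract, with plain objects standing in for real connections:

```javascript
// Reduced sketch of useDb()'s `useCache` behavior: connections derived
// from the same base connection are memoized by database name in
// `relatedDbs`; without the option, every call builds a fresh object.
function useDb(baseConn, name, options = {}) {
  if (options.useCache && baseConn.relatedDbs[name]) {
    return baseConn.relatedDbs[name];
  }
  const newConn = { name, relatedDbs: baseConn.relatedDbs };
  if (options.useCache) {
    baseConn.relatedDbs[name] = newConn;
  }
  return newConn;
}

const base = { relatedDbs: {} };
const a = useDb(base, 'mydb', { useCache: true });
const b = useDb(base, 'mydb', { useCache: true });
const c = useDb(base, 'mydb'); // no cache: fresh object every call
console.log(a === b, a === c); // true false
```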
/**
* Runs a [db-level aggregate()](https://www.mongodb.com/docs/manual/reference/method/db.aggregate/) on this connection's underlying `db`
*
* @param {Array} pipeline
* @param {Object} [options]
*/
NativeConnection.prototype.aggregate = function aggregate(pipeline, options) {
return new this.base.Aggregate(null, this).append(pipeline).option(options ?? {});
};
/**
* Removes the database connection with the given name created with `useDb()`.
*
* Throws an error if the database connection was not found.
*
* #### Example:
*
* // Connect to `initialdb` first
* const conn = await mongoose.createConnection('mongodb://127.0.0.1:27017/initialdb').asPromise();
*
* // Creates an un-cached connection to `mydb`
* const db = conn.useDb('mydb');
*
* // Closes `db`, and removes `db` from `conn.relatedDbs` and `conn.otherDbs`
* await conn.removeDb('mydb');
*
* @method removeDb
* @memberOf Connection
* @param {String} name The database name
* @return {Connection} this
*/
NativeConnection.prototype.removeDb = function removeDb(name) {
const dbs = this.otherDbs.filter(db => db.name === name);
if (!dbs.length) {
throw new MongooseError(`No connections to database "${name}" found`);
}
for (const db of dbs) {
db._closeCalled = true;
db._destroyCalled = true;
db._readyState = STATES.disconnected;
db.$wasForceClosed = true;
}
delete this.relatedDbs[name];
this.otherDbs = this.otherDbs.filter(db => db.name !== name);
return this;
};
/**
* Closes the connection
*
* @param {Boolean} [force]
* @return {Connection} this
* @api private
*/
NativeConnection.prototype.doClose = async function doClose(force) {
if (this.client == null) {
return this;
}
let skipCloseClient = false;
if (force != null && typeof force === 'object') {
skipCloseClient = force.skipCloseClient;
force = force.force;
}
if (skipCloseClient) {
return this;
}
await this.client.close(force);
// Defer because the driver will wait at least 1ms before finishing closing
// the pool, see https://github.com/mongodb-js/mongodb-core/blob/a8f8e4ce41936babc3b9112bf42d609779f03b39/lib/connection/pool.js#L1026-L1030.
// If there's queued operations, you may still get some background work
// after the callback is called.
await new Promise(resolve => setTimeout(resolve, 1));
return this;
};
/**
* Implementation of `listDatabases()` for MongoDB driver
*
* @return Promise
* @api public
*/
NativeConnection.prototype.listDatabases = async function listDatabases() {
await this._waitForConnect();
return await this.db.admin().listDatabases();
};
/*!
* ignore
*/
NativeConnection.prototype.createClient = async function createClient(uri, options) {
if (typeof uri !== 'string') {
throw new MongooseError('The `uri` parameter to `openUri()` must be a ' +
`string, got "${typeof uri}". Make sure the first parameter to ` +
'`mongoose.connect()` or `mongoose.createConnection()` is a string.');
}
if (this._destroyCalled) {
throw new MongooseError(
'Connection has been closed and destroyed, and cannot be used for re-opening the connection. ' +
'Please create a new connection with `mongoose.createConnection()` or `mongoose.connect()`.'
);
}
if (this.readyState === STATES.connecting || this.readyState === STATES.connected) {
if (this._connectionString !== uri) {
throw new MongooseError('Can\'t call `openUri()` on an active connection with ' +
'different connection strings. Make sure you aren\'t calling `mongoose.connect()` ' +
'multiple times. See: https://mongoosejs.com/docs/connections.html#multiple_connections');
}
}
options = processConnectionOptions(uri, options);
if (options) {
const autoIndex = options.config && options.config.autoIndex != null ?
options.config.autoIndex :
options.autoIndex;
if (autoIndex != null) {
this.config.autoIndex = autoIndex !== false;
delete options.config;
delete options.autoIndex;
}
if ('autoCreate' in options) {
this.config.autoCreate = !!options.autoCreate;
delete options.autoCreate;
}
if ('sanitizeFilter' in options) {
this.config.sanitizeFilter = options.sanitizeFilter;
delete options.sanitizeFilter;
}
if ('autoSearchIndex' in options) {
this.config.autoSearchIndex = options.autoSearchIndex;
delete options.autoSearchIndex;
}
if ('bufferTimeoutMS' in options) {
this.config.bufferTimeoutMS = options.bufferTimeoutMS;
delete options.bufferTimeoutMS;
}
// Backwards compat
if (options.user || options.pass) {
options.auth = options.auth || {};
options.auth.username = options.user;
options.auth.password = options.pass;
this.user = options.user;
this.pass = options.pass;
}
delete options.user;
delete options.pass;
if (options.bufferCommands != null) {
this.config.bufferCommands = options.bufferCommands;
delete options.bufferCommands;
}
} else {
options = {};
}
this._connectionOptions = options;
const dbName = options.dbName;
if (dbName != null) {
this.$dbName = dbName;
}
delete options.dbName;
if (!utils.hasUserDefinedProperty(options, 'driverInfo')) {
options.driverInfo = {
name: 'Mongoose',
version: pkg.version
};
}
const { schemaMap, encryptedFieldsMap } = this._buildEncryptionSchemas();
if ((Object.keys(schemaMap).length > 0 || Object.keys(encryptedFieldsMap).length) && !options.autoEncryption) {
throw new Error('Must provide `autoEncryption` when connecting with encrypted schemas.');
}
if (Object.keys(schemaMap).length > 0) {
options.autoEncryption.schemaMap = schemaMap;
}
if (Object.keys(encryptedFieldsMap).length > 0) {
options.autoEncryption.encryptedFieldsMap = encryptedFieldsMap;
}
this.readyState = STATES.connecting;
this._connectionString = uri;
let client;
try {
client = new mongodb.MongoClient(uri, options);
} catch (error) {
this.readyState = STATES.disconnected;
throw error;
}
this.client = client;
client.setMaxListeners(0);
await client.connect();
_setClient(this, client, options, dbName);
for (const db of this.otherDbs) {
_setClient(db, client, {}, db.name);
}
return this;
};
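Among the option-processing steps in `createClient()`, the backwards-compat block folds top-level `user`/`pass` into the driver's `auth` object. That transformation in isolation (the credentials are placeholder values):

```javascript
// Backwards-compat sketch: legacy `user`/`pass` options are moved into
// the MongoDB driver's `auth: { username, password }` shape, and the
// legacy keys are removed before the client is constructed.
function normalizeAuth(opts) {
  const options = Object.assign({}, opts);
  if (options.user || options.pass) {
    options.auth = options.auth || {};
    options.auth.username = options.user;
    options.auth.password = options.pass;
  }
  delete options.user;
  delete options.pass;
  return options;
}

const out = normalizeAuth({ user: 'alice', pass: 's3cret', retryWrites: true });
console.log(out);
// { retryWrites: true, auth: { username: 'alice', password: 's3cret' } }
```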
/**
* Given a connection, which may or may not have encrypted models, build
* a schemaMap and/or an encryptedFieldsMap for the connection, combining all models
* into a single schemaMap and encryptedFields map.
*
* @returns the generated schemaMap and encryptedFieldsMap
*/
NativeConnection.prototype._buildEncryptionSchemas = function() {
const qeMappings = {};
const csfleMappings = {};
const encryptedModels = Object.values(this.models).filter(model => model.schema._hasEncryptedFields());
// If discriminators are configured for the collection, there might be multiple models
// pointing to the same namespace. For this scenario, we merge all the schemas for each namespace
// into a single schema and then generate a schemaMap/encryptedFieldsMap for the combined schema.
for (const model of encryptedModels) {
const { schema, collection: { collectionName } } = model;
const namespace = `${this.$dbName}.${collectionName}`;
const mappings = schema.encryptionType() === 'csfle' ? csfleMappings : qeMappings;
mappings[namespace] ??= new Schema({}, { encryptionType: schema.encryptionType() });
const isNonRootDiscriminator = schema.discriminatorMapping && !schema.discriminatorMapping.isRoot;
if (isNonRootDiscriminator) {
const rootSchema = schema._baseSchema;
schema.eachPath((pathname) => {
if (rootSchema.path(pathname)) return;
if (!mappings[namespace]._hasEncryptedField(pathname)) return;
throw new Error(`Cannot have duplicate keys in discriminators with encryption. key=${pathname}`);
});
}
mappings[namespace].add(schema);
}
const schemaMap = Object.fromEntries(Object.entries(csfleMappings).map(
([namespace, schema]) => ([namespace, schema._buildSchemaMap()])
));
const encryptedFieldsMap = Object.fromEntries(Object.entries(qeMappings).map(
([namespace, schema]) => ([namespace, schema._buildEncryptedFields()])
));
return {
schemaMap, encryptedFieldsMap
};
};
/*!
* ignore
*/
NativeConnection.prototype.setClient = function setClient(client) {
if (!(client instanceof mongodb.MongoClient)) {
throw new MongooseError('Must call `setClient()` with an instance of MongoClient');
}
if (this.readyState !== STATES.disconnected) {
throw new MongooseError('Cannot call `setClient()` on a connection that is already connected.');
}
if (client.topology == null) {
throw new MongooseError('Cannot call `setClient()` with a MongoClient that you have not called `connect()` on yet.');
}
this._connectionString = client.s.url;
_setClient(this, client, {}, client.s.options.dbName);
for (const model of Object.values(this.models)) {
// Errors handled internally, so safe to ignore error
model.init().catch(function $modelInitNoop() {});
}
return this;
};
/*!
* ignore
*/
function _setClient(conn, client, options, dbName) {
const db = dbName != null ? client.db(dbName) : client.db();
conn.db = db;
conn.client = client;
conn.host = client &&
client.s &&
client.s.options &&
client.s.options.hosts &&
client.s.options.hosts[0] &&
client.s.options.hosts[0].host || void 0;
conn.port = client &&
client.s &&
client.s.options &&
client.s.options.hosts &&
client.s.options.hosts[0] &&
client.s.options.hosts[0].port || void 0;
conn.name = dbName != null ? dbName : db.databaseName;
conn._closeCalled = client._closeCalled;
const _handleReconnect = () => {
// If we aren't disconnected, we assume this reconnect is due to a
// socket timeout. If there's no activity on a socket for
// `socketTimeoutMS`, the driver will attempt to reconnect and emit
// this event.
if (conn.readyState !== STATES.connected) {
conn.readyState = STATES.connected;
conn.emit('reconnect');
conn.emit('reconnected');
conn.onOpen();
}
};
const type = client &&
client.topology &&
client.topology.description &&
client.topology.description.type || '';
if (type === 'Single') {
client.on('serverDescriptionChanged', ev => {
const newDescription = ev.newDescription;
if (newDescription.type === 'Unknown') {
conn.readyState = STATES.disconnected;
} else {
_handleReconnect();
}
});
} else if (type.startsWith('ReplicaSet')) {
client.on('topologyDescriptionChanged', ev => {
// Emit disconnected if we've lost connectivity to the primary
const description = ev.newDescription;
if (conn.readyState === STATES.connected && description.type !== 'ReplicaSetWithPrimary') {
// Implicitly emits 'disconnected'
conn.readyState = STATES.disconnected;
} else if (conn.readyState === STATES.disconnected && description.type === 'ReplicaSetWithPrimary') {
_handleReconnect();
}
});
}
conn._lastHeartbeatAt = null;
client.on('serverHeartbeatSucceeded', () => {
conn._lastHeartbeatAt = Date.now();
});
if (options.monitorCommands) {
client.on('commandStarted', (data) => conn.emit('commandStarted', data));
client.on('commandFailed', (data) => conn.emit('commandFailed', data));
client.on('commandSucceeded', (data) => conn.emit('commandSucceeded', data));
}
conn.onOpen();
for (const i in conn.collections) {
if (utils.object.hasOwnProperty(conn.collections, i)) {
conn.collections[i].onOpen();
}
}
}
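The replica-set branch of `_setClient()` above flips `readyState` based on whether the new topology description still has a primary. The transition logic in isolation, with numeric values standing in for Mongoose's STATES map:

```javascript
// Isolated sketch of the topologyDescriptionChanged handling above:
// losing the primary moves the connection to disconnected; regaining
// it triggers the reconnect path.
const STATES = { disconnected: 0, connected: 1 };

function onTopologyDescriptionChanged(conn, newDescription) {
  if (conn.readyState === STATES.connected &&
      newDescription.type !== 'ReplicaSetWithPrimary') {
    conn.readyState = STATES.disconnected;
  } else if (conn.readyState === STATES.disconnected &&
             newDescription.type === 'ReplicaSetWithPrimary') {
    conn.readyState = STATES.connected; // _handleReconnect() in the real code
  }
}

const conn = { readyState: STATES.connected };
onTopologyDescriptionChanged(conn, { type: 'ReplicaSetNoPrimary' });
console.log(conn.readyState); // 0 (disconnected)
onTopologyDescriptionChanged(conn, { type: 'ReplicaSetWithPrimary' });
console.log(conn.readyState); // 1 (connected again)
```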
/*!
* Module exports.
*/
module.exports = NativeConnection;


@ -0,0 +1,10 @@
/*!
* Module exports.
*/
'use strict';
exports.BulkWriteResult = require('./bulkWriteResult');
exports.Collection = require('./collection');
exports.Connection = require('./connection');
exports.ClientEncryption = require('mongodb').ClientEncryption;


@ -0,0 +1,29 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* MissingSchema Error constructor.
*/
class MissingSchemaError extends MongooseError {
constructor() {
super('Schema hasn\'t been registered for document.\n'
+ 'Use mongoose.Document(name, schema)');
}
}
Object.defineProperty(MissingSchemaError.prototype, 'name', {
value: 'MissingSchemaError'
});
/*!
* exports
*/
module.exports = MissingSchemaError;


@ -0,0 +1,44 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* If the underlying `bulkWrite()` for `bulkSave()` succeeded, but wasn't able to update or
* insert all documents, we throw this error.
*
* @api private
*/
class MongooseBulkSaveIncompleteError extends MongooseError {
constructor(modelName, documents, bulkWriteResult) {
const matchedCount = bulkWriteResult?.matchedCount ?? 0;
const insertedCount = bulkWriteResult?.insertedCount ?? 0;
let preview = documents.map(doc => doc._id).join(', ');
if (preview.length > 100) {
preview = preview.slice(0, 100) + '...';
}
const numDocumentsNotUpdated = documents.length - matchedCount - insertedCount;
super(`${modelName}.bulkSave() was not able to update ${numDocumentsNotUpdated} of the given documents due to incorrect version or optimistic concurrency, document ids: ${preview}`);
this.modelName = modelName;
this.documents = documents;
this.bulkWriteResult = bulkWriteResult;
this.numDocumentsNotUpdated = numDocumentsNotUpdated;
}
}
Object.defineProperty(MongooseBulkSaveIncompleteError.prototype, 'name', {
value: 'MongooseBulkSaveIncompleteError'
});
/*!
* exports
*/
module.exports = MongooseBulkSaveIncompleteError;
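The `numDocumentsNotUpdated` arithmetic above counts any document that was neither matched by an update nor inserted as a failure. In isolation:

```javascript
// Sketch of the count behind MongooseBulkSaveIncompleteError: documents
// that were neither matched (updated in place) nor inserted were skipped,
// typically due to a version/optimistic-concurrency mismatch.
function countNotUpdated(documents, bulkWriteResult) {
  const matchedCount = bulkWriteResult?.matchedCount ?? 0;
  const insertedCount = bulkWriteResult?.insertedCount ?? 0;
  return documents.length - matchedCount - insertedCount;
}

// 5 docs saved, 2 updated in place, 1 newly inserted -> 2 left behind
console.log(countNotUpdated(new Array(5).fill({}), { matchedCount: 2, insertedCount: 1 })); // 2
```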


@ -0,0 +1,41 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./');
/**
* If `bulkWrite()` or `insertMany()` has validation errors, but
* all valid operations succeed, and 'throwOnValidationError' is true,
* Mongoose will throw this error.
*
* @api private
*/
class MongooseBulkWriteError extends MongooseError {
constructor(validationErrors, results, rawResult, operation) {
let preview = validationErrors.map(e => e.message).join(', ');
if (preview.length > 200) {
preview = preview.slice(0, 200) + '...';
}
super(`${operation} failed with ${validationErrors.length} Mongoose validation errors: ${preview}`);
this.validationErrors = validationErrors;
this.results = results;
this.rawResult = rawResult;
this.operation = operation;
}
}
Object.defineProperty(MongooseBulkWriteError.prototype, 'name', {
value: 'MongooseBulkWriteError'
});
/*!
* exports
*/
module.exports = MongooseBulkWriteError;


@ -0,0 +1,158 @@
'use strict';
/*!
* Module dependencies.
*/
const MongooseError = require('./mongooseError');
const util = require('util');
/**
* Casting Error constructor.
*
* @param {String} type
* @param {String} value
* @inherits MongooseError
* @api private
*/
class CastError extends MongooseError {
constructor(type, value, path, reason, schemaType) {
// If no args, assume we'll `init()` later.
if (arguments.length > 0) {
const valueType = getValueType(value);
const messageFormat = getMessageFormat(schemaType);
const msg = formatMessage(null, type, value, path, messageFormat, valueType, reason);
super(msg);
this.init(type, value, path, reason, schemaType);
} else {
super(formatMessage());
}
}
toJSON() {
return {
stringValue: this.stringValue,
valueType: this.valueType,
kind: this.kind,
value: this.value,
path: this.path,
reason: this.reason,
name: this.name,
message: this.message
};
}
/*!
* ignore
*/
init(type, value, path, reason, schemaType) {
this.stringValue = getStringValue(value);
this.messageFormat = getMessageFormat(schemaType);
this.kind = type;
this.value = value;
this.path = path;
this.reason = reason;
this.valueType = getValueType(value);
}
/**
* ignore
* @param {Readonly<CastError>} other
* @api private
*/
copy(other) {
this.messageFormat = other.messageFormat;
this.stringValue = other.stringValue;
this.kind = other.kind;
this.value = other.value;
this.path = other.path;
this.reason = other.reason;
this.message = other.message;
this.valueType = other.valueType;
}
/*!
* ignore
*/
setModel(model) {
this.message = formatMessage(model, this.kind, this.value, this.path,
this.messageFormat, this.valueType);
}
}
Object.defineProperty(CastError.prototype, 'name', {
value: 'CastError'
});
function getStringValue(value) {
let stringValue = util.inspect(value);
stringValue = stringValue.replace(/^'|'$/g, '"');
if (!stringValue.startsWith('"')) {
stringValue = '"' + stringValue + '"';
}
return stringValue;
}
function getValueType(value) {
if (value == null) {
return '' + value;
}
const t = typeof value;
if (t !== 'object') {
return t;
}
if (typeof value.constructor !== 'function') {
return t;
}
return value.constructor.name;
}
function getMessageFormat(schemaType) {
const messageFormat = schemaType && schemaType._castErrorMessage || null;
if (typeof messageFormat === 'string' || typeof messageFormat === 'function') {
return messageFormat;
}
}
/*!
* ignore
*/
function formatMessage(model, kind, value, path, messageFormat, valueType, reason) {
if (typeof messageFormat === 'string') {
const stringValue = getStringValue(value);
let ret = messageFormat.
replace('{KIND}', kind).
replace('{VALUE}', stringValue).
replace('{PATH}', path);
if (model != null) {
ret = ret.replace('{MODEL}', model.modelName);
}
return ret;
} else if (typeof messageFormat === 'function') {
return messageFormat(value, path, model, kind);
} else {
const stringValue = getStringValue(value);
const valueTypeMsg = valueType ? ' (type ' + valueType + ')' : '';
let ret = 'Cast to ' + kind + ' failed for value ' +
stringValue + valueTypeMsg + ' at path "' + path + '"';
if (model != null) {
ret += ' for model "' + model.modelName + '"';
}
if (reason != null &&
typeof reason.constructor === 'function' &&
reason.constructor.name !== 'AssertionError' &&
reason.constructor.name !== 'Error') {
ret += ' because of "' + reason.constructor.name + '"';
}
return ret;
}
}
/*!
* exports
*/
module.exports = CastError;


@ -0,0 +1,26 @@
'use strict';
const MongooseError = require('./mongooseError');
/**
* createCollections Error constructor
*
* @param {String} message
* @param {String} errorsMap
* @inherits MongooseError
* @api private
*/
class CreateCollectionsError extends MongooseError {
constructor(message, errorsMap) {
super(message);
this.errors = errorsMap;
}
}
Object.defineProperty(CreateCollectionsError.prototype, 'name', {
value: 'CreateCollectionsError'
});
module.exports = CreateCollectionsError;


@ -0,0 +1,40 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* DivergentArrayError constructor.
* @param {Array<String>} paths
* @api private
*/
class DivergentArrayError extends MongooseError {
constructor(paths) {
const msg = 'For your own good, using `document.save()` to update an array '
+ 'which was selected using an $elemMatch projection OR '
+ 'populated using skip, limit, query conditions, or exclusion of '
+ 'the _id field when the operation results in a $pop or $set of '
+ 'the entire array is not supported. The following '
+ 'path(s) would have been modified unsafely:\n'
+ ' ' + paths.join('\n ') + '\n'
+ 'Use Model.updateOne() to update these arrays instead.';
// TODO write up a docs page (FAQ) and link to it
super(msg);
}
}
Object.defineProperty(DivergentArrayError.prototype, 'name', {
value: 'DivergentArrayError'
});
/*!
* exports
*/
module.exports = DivergentArrayError;


@ -0,0 +1,41 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* If `eachAsync()` is called with `continueOnError: true`, there can be
* multiple errors. This error class contains an `errors` property, which
* contains an array of all errors that occurred in `eachAsync()`.
*
* @api private
*/
class EachAsyncMultiError extends MongooseError {
/**
* @param {Error[]} errors The errors collected while iterating with `continueOnError: true`
*/
constructor(errors) {
let preview = errors.map(e => e.message).join(', ');
if (preview.length > 50) {
preview = preview.slice(0, 50) + '...';
}
super(`eachAsync() finished with ${errors.length} errors: ${preview}`);
this.errors = errors;
}
}
Object.defineProperty(EachAsyncMultiError.prototype, 'name', {
value: 'EachAsyncMultiError'
});
/*!
* exports
*/
module.exports = EachAsyncMultiError;


@ -0,0 +1,237 @@
'use strict';
/**
* MongooseError constructor. MongooseError is the base class for all
* Mongoose-specific errors.
*
* #### Example:
*
* const Model = mongoose.model('Test', new mongoose.Schema({ answer: Number }));
* const doc = new Model({ answer: 'not a number' });
* const err = doc.validateSync();
*
* err instanceof mongoose.Error.ValidationError; // true
*
* @constructor Error
* @param {String} msg Error message
* @inherits Error https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Error
*/
const MongooseError = require('./mongooseError');
/**
* The name of the error. The name uniquely identifies this Mongoose error. The
* possible values are:
*
* - `MongooseError`: general Mongoose error
* - `CastError`: Mongoose could not convert a value to the type defined in the schema path. May be in a `ValidationError` class' `errors` property.
* - `DivergentArrayError`: You attempted to `save()` an array that was modified after you loaded it with a `$elemMatch` or similar projection
* - `MissingSchemaError`: You tried to access a model with [`mongoose.model()`](https://mongoosejs.com/docs/api/mongoose.html#Mongoose.model()) that was not defined
* - `DocumentNotFoundError`: The document you tried to [`save()`](https://mongoosejs.com/docs/api/document.html#Document.prototype.save()) was not found
* - `ValidatorError`: error from an individual schema path's validator
* - `ValidationError`: error returned from [`validate()`](https://mongoosejs.com/docs/api/document.html#Document.prototype.validate()) or [`validateSync()`](https://mongoosejs.com/docs/api/document.html#Document.prototype.validateSync()). Contains zero or more `ValidatorError` instances in `.errors` property.
* - `ObjectExpectedError`: Thrown when you set a nested path to a non-object value with [strict mode set](https://mongoosejs.com/docs/guide.html#strict).
* - `ObjectParameterError`: Thrown when you pass a non-object value to a function which expects an object as a parameter
* - `OverwriteModelError`: Thrown when you call [`mongoose.model()`](https://mongoosejs.com/docs/api/mongoose.html#Mongoose.model()) to re-define a model that was already defined.
* - `ParallelSaveError`: Thrown when you call [`save()`](https://mongoosejs.com/docs/api/model.html#Model.prototype.save()) on a document when the same document instance is already saving.
* - `StrictModeError`: Thrown when you set a path that isn't in the schema and [strict mode](https://mongoosejs.com/docs/guide.html#strict) is set to `throw`.
* - `VersionError`: Thrown when the [document is out of sync](https://mongoosejs.com/docs/guide.html#versionKey)
*
* @api public
* @property {String} name
* @memberOf Error
* @instance
*/
/*!
* Module exports.
*/
module.exports = exports = MongooseError;
/**
* The default built-in validator error messages.
*
* @see Error.messages https://mongoosejs.com/docs/api/error.html#Error.messages
* @api public
* @memberOf Error
* @static
*/
MongooseError.messages = require('./messages');
// backward compat
MongooseError.Messages = MongooseError.messages;
/**
* An instance of this error class will be thrown when mongoose failed to
* cast a value.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.CastError = require('./cast');
/**
* An instance of this error class will be thrown when `save()` fails
* because the underlying
* document was not found. The constructor takes one parameter, the
* conditions that mongoose passed to `updateOne()` when trying to update
* the document.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.DocumentNotFoundError = require('./notFound');
/**
* An instance of this error class will be thrown when [validation](https://mongoosejs.com/docs/validation.html) failed.
* The `errors` property contains an object whose keys are the paths that failed and whose values are
* instances of CastError or ValidationError.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.ValidationError = require('./validation');
/**
* A `ValidationError` has a hash of `errors` that contain individual
* `ValidatorError` instances.
*
* #### Example:
*
* const schema = Schema({ name: { type: String, required: true } });
* const Model = mongoose.model('Test', schema);
* const doc = new Model({});
*
* // Top-level error is a ValidationError, **not** a ValidatorError
* const err = doc.validateSync();
* err instanceof mongoose.Error.ValidationError; // true
*
* // A ValidationError `err` has 0 or more ValidatorErrors keyed by the
* // path in the `err.errors` property.
* err.errors['name'] instanceof mongoose.Error.ValidatorError;
*
* err.errors['name'].kind; // 'required'
* err.errors['name'].path; // 'name'
* err.errors['name'].value; // undefined
*
* Instances of `ValidatorError` have the following properties:
*
* - `kind`: The validator's `type`, like `'required'` or `'regexp'`
* - `path`: The path that failed validation
* - `value`: The value that failed validation
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.ValidatorError = require('./validator');
/**
* An instance of this error class will be thrown when you call `save()` after
* the document in the database was changed in a potentially unsafe way. See
* the [`versionKey` option](https://mongoosejs.com/docs/guide.html#versionKey) for more information.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.VersionError = require('./version');
/**
* An instance of this error class will be thrown when you call `save()` multiple
* times on the same document in parallel. See the [FAQ](https://mongoosejs.com/docs/faq.html) for more
* information.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.ParallelSaveError = require('./parallelSave');
/**
* Thrown when a model with the given name was already registered on the connection.
* See [the FAQ about `OverwriteModelError`](https://mongoosejs.com/docs/faq.html#overwrite-model-error).
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.OverwriteModelError = require('./overwriteModel');
/**
* Thrown when you try to access a model that has not been registered yet
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.MissingSchemaError = require('./missingSchema');
/**
 * Thrown when some documents fail to save when calling `bulkSave()`
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.MongooseBulkSaveIncompleteError = require('./bulkSaveIncompleteError');
/**
* Thrown when the MongoDB Node driver can't connect to a valid server
* to send an operation to.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.MongooseServerSelectionError = require('./serverSelection');
/**
* An instance of this error will be thrown if you used an array projection
* and then modified the array in an unsafe way.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.DivergentArrayError = require('./divergentArray');
/**
 * Thrown when you try to pass values to the model constructor that
 * were not specified in the schema, or try to change immutable properties, when
 * `strict` mode is `"throw"`
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.StrictModeError = require('./strict');
/**
 * An instance of this error class will be thrown when you try to populate
 * a path that is not in your schema and the `strictPopulate` option is enabled.
*
* @api public
* @memberOf Error
* @static
*/
MongooseError.StrictPopulateError = require('./strictPopulate');

@@ -0,0 +1,32 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* InvalidSchemaOption Error constructor.
 * @param {String} name
 * @param {String} option
* @api private
*/
class InvalidSchemaOptionError extends MongooseError {
constructor(name, option) {
const msg = `Cannot use schema for property "${name}" because the schema has the ${option} option enabled.`;
super(msg);
}
}
Object.defineProperty(InvalidSchemaOptionError.prototype, 'name', {
value: 'InvalidSchemaOptionError'
});
/*!
* exports
*/
module.exports = InvalidSchemaOptionError;

@@ -0,0 +1,47 @@
/**
* The default built-in validator error messages. These may be customized.
*
* // customize within each schema or globally like so
* const mongoose = require('mongoose');
* mongoose.Error.messages.String.enum = "Your custom message for {PATH}.";
*
* Error messages support basic templating. Mongoose will replace the following strings with the corresponding value.
*
* - `{PATH}` is replaced with the invalid document path
* - `{VALUE}` is replaced with the invalid value
* - `{TYPE}` is replaced with the validator type such as "regexp", "min", or "user defined"
* - `{MIN}` is replaced with the declared min value for the Number.min validator
* - `{MAX}` is replaced with the declared max value for the Number.max validator
*
* Click the "show code" link below to see all defaults.
*
* @static
* @memberOf MongooseError
* @api public
*/
'use strict';
const msg = module.exports = exports = {};
msg.DocumentNotFoundError = null;
msg.general = {};
msg.general.default = 'Validator failed for path `{PATH}` with value `{VALUE}`';
msg.general.required = 'Path `{PATH}` is required.';
msg.Number = {};
msg.Number.min = 'Path `{PATH}` ({VALUE}) is less than minimum allowed value ({MIN}).';
msg.Number.max = 'Path `{PATH}` ({VALUE}) is more than maximum allowed value ({MAX}).';
msg.Number.enum = '`{VALUE}` is not a valid enum value for path `{PATH}`.';
msg.Date = {};
msg.Date.min = 'Path `{PATH}` ({VALUE}) is before minimum allowed value ({MIN}).';
msg.Date.max = 'Path `{PATH}` ({VALUE}) is after maximum allowed value ({MAX}).';
msg.String = {};
msg.String.enum = '`{VALUE}` is not a valid enum value for path `{PATH}`.';
msg.String.match = 'Path `{PATH}` is invalid ({VALUE}).';
msg.String.minlength = 'Path `{PATH}` (`{VALUE}`) is shorter than the minimum allowed length ({MINLENGTH}).';
msg.String.maxlength = 'Path `{PATH}` (`{VALUE}`) is longer than the maximum allowed length ({MAXLENGTH}).';

@@ -0,0 +1,33 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* MissingSchema Error constructor.
* @param {String} name
* @api private
*/
class MissingSchemaError extends MongooseError {
constructor(name) {
const msg = 'Schema hasn\'t been registered for model "' + name + '".\n'
+ 'Use mongoose.model(name, schema)';
super(msg);
}
}
Object.defineProperty(MissingSchemaError.prototype, 'name', {
value: 'MissingSchemaError'
});
/*!
* exports
*/
module.exports = MissingSchemaError;

@@ -0,0 +1,13 @@
'use strict';
/*!
* ignore
*/
class MongooseError extends Error { }
Object.defineProperty(MongooseError.prototype, 'name', {
value: 'MongooseError'
});
module.exports = MongooseError;

@@ -0,0 +1,47 @@
'use strict';
/*!
* Module dependencies.
*/
const MongooseError = require('./mongooseError');
const util = require('util');
/**
 * DocumentNotFound Error constructor.
 * @param {Object} filter
 * @param {String} model
 * @param {Number} numAffected
 * @param {Object} result
* @api private
*/
class DocumentNotFoundError extends MongooseError {
constructor(filter, model, numAffected, result) {
let msg;
const messages = MongooseError.messages;
if (messages.DocumentNotFoundError != null) {
msg = typeof messages.DocumentNotFoundError === 'function' ?
messages.DocumentNotFoundError(filter, model) :
messages.DocumentNotFoundError;
} else {
msg = 'No document found for query "' + util.inspect(filter) +
'" on model "' + model + '"';
}
super(msg);
this.result = result;
this.numAffected = numAffected;
this.filter = filter;
// Backwards compat
this.query = filter;
}
}
Object.defineProperty(DocumentNotFoundError.prototype, 'name', {
value: 'DocumentNotFoundError'
});
/*!
* exports
*/
module.exports = DocumentNotFoundError;

@@ -0,0 +1,31 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
 * ObjectExpected Error constructor
 *
 * @param {String} path
 * @param {Any} val
* @api private
*/
class ObjectExpectedError extends MongooseError {
constructor(path, val) {
const typeDescription = Array.isArray(val) ? 'array' : 'primitive value';
super('Tried to set nested object field `' + path +
`\` to ${typeDescription} \`` + val + '`');
this.path = path;
}
}
Object.defineProperty(ObjectExpectedError.prototype, 'name', {
value: 'ObjectExpectedError'
});
module.exports = ObjectExpectedError;

@@ -0,0 +1,32 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* Constructor for errors that happen when a parameter that's expected to be
* an object isn't an object
*
* @param {Any} value
* @param {String} paramName
* @param {String} fnName
* @api private
*/
class ObjectParameterError extends MongooseError {
constructor(value, paramName, fnName) {
super('Parameter "' + paramName + '" to ' + fnName +
'() must be an object, got "' + value.toString() + '" (type ' + typeof value + ')');
}
}
Object.defineProperty(ObjectParameterError.prototype, 'name', {
value: 'ObjectParameterError'
});
module.exports = ObjectParameterError;

@@ -0,0 +1,31 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* OverwriteModel Error constructor.
* @param {String} name
* @api private
*/
class OverwriteModelError extends MongooseError {
constructor(name) {
super('Cannot overwrite `' + name + '` model once compiled.');
}
}
Object.defineProperty(OverwriteModelError.prototype, 'name', {
value: 'OverwriteModelError'
});
/*!
* exports
*/
module.exports = OverwriteModelError;

@@ -0,0 +1,33 @@
'use strict';
/*!
* Module dependencies.
*/
const MongooseError = require('./mongooseError');
/**
* ParallelSave Error constructor.
*
* @param {Document} doc
* @api private
*/
class ParallelSaveError extends MongooseError {
constructor(doc) {
const msg = 'Can\'t save() the same doc multiple times in parallel. Document: ';
super(msg + doc._doc._id);
}
}
Object.defineProperty(ParallelSaveError.prototype, 'name', {
value: 'ParallelSaveError'
});
/*!
* exports
*/
module.exports = ParallelSaveError;

@@ -0,0 +1,33 @@
'use strict';
/*!
* Module dependencies.
*/
const MongooseError = require('./mongooseError');
/**
* ParallelValidate Error constructor.
*
* @param {Document} doc
* @api private
*/
class ParallelValidateError extends MongooseError {
constructor(doc) {
const msg = 'Can\'t validate() the same doc multiple times in parallel. Document: ';
super(msg + doc._doc._id);
}
}
Object.defineProperty(ParallelValidateError.prototype, 'name', {
value: 'ParallelValidateError'
});
/*!
* exports
*/
module.exports = ParallelValidateError;

@@ -0,0 +1,62 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
const allServersUnknown = require('../helpers/topology/allServersUnknown');
const isAtlas = require('../helpers/topology/isAtlas');
const isSSLError = require('../helpers/topology/isSSLError');
/*!
* ignore
*/
const atlasMessage = 'Could not connect to any servers in your MongoDB Atlas cluster. ' +
'One common reason is that you\'re trying to access the database from ' +
'an IP that isn\'t whitelisted. Make sure your current IP address is on your Atlas ' +
'cluster\'s IP whitelist: https://www.mongodb.com/docs/atlas/security-whitelist/';
const sslMessage = 'Mongoose is connecting with SSL enabled, but the server is ' +
'not accepting SSL connections. Please ensure that the MongoDB server you are ' +
'connecting to is configured to accept SSL connections. Learn more: ' +
'https://mongoosejs.com/docs/tutorials/ssl.html';
class MongooseServerSelectionError extends MongooseError {
/**
 * Copy the relevant state from a MongoDB driver server selection error onto
 * this error, substituting a friendlier message for common Atlas and SSL issues.
*
* @api private
*/
assimilateError(err) {
const reason = err.reason;
// Special message for a case that is likely due to IP whitelisting issues.
const isAtlasWhitelistError = isAtlas(reason) &&
allServersUnknown(reason) &&
err.message.indexOf('bad auth') === -1 &&
err.message.indexOf('Authentication failed') === -1;
if (isAtlasWhitelistError) {
this.message = atlasMessage;
} else if (isSSLError(reason)) {
this.message = sslMessage;
} else {
this.message = err.message;
}
for (const key in err) {
if (key !== 'name') {
this[key] = err[key];
}
}
this.cause = reason;
return this;
}
}
Object.defineProperty(MongooseServerSelectionError.prototype, 'name', {
value: 'MongooseServerSelectionError'
});
module.exports = MongooseServerSelectionError;

@@ -0,0 +1,103 @@
/*!
* Module requirements
*/
'use strict';
const MongooseError = require('./mongooseError');
const util = require('util');
const combinePathErrors = require('../helpers/error/combinePathErrors');
/**
* Mongoose.set Error
*
* @api private
* @inherits MongooseError
*/
class SetOptionError extends MongooseError {
constructor() {
super('');
this.errors = {};
}
/**
* Console.log helper
*/
toString() {
return combinePathErrors(this);
}
/**
* inspect helper
* @api private
*/
inspect() {
return Object.assign(new Error(this.message), this);
}
/**
* add message
* @param {String} key
* @param {String|Error} error
* @api private
*/
addError(key, error) {
if (error instanceof SetOptionError) {
const { errors } = error;
for (const optionKey of Object.keys(errors)) {
this.addError(optionKey, errors[optionKey]);
}
return;
}
this.errors[key] = error;
this.message = combinePathErrors(this);
}
}
if (util.inspect.custom) {
// Avoid Node deprecation warning DEP0079
SetOptionError.prototype[util.inspect.custom] = SetOptionError.prototype.inspect;
}
/**
* Helper for JSON.stringify
* Ensure `name` and `message` show up in toJSON output re: gh-9847
* @api private
*/
Object.defineProperty(SetOptionError.prototype, 'toJSON', {
enumerable: false,
writable: false,
configurable: true,
value: function() {
return Object.assign({}, this, { name: this.name, message: this.message });
}
});
Object.defineProperty(SetOptionError.prototype, 'name', {
value: 'SetOptionError'
});
class SetOptionInnerError extends MongooseError {
/**
* Error for the "errors" array in "SetOptionError" with consistent message
* @param {String} key
*/
constructor(key) {
super(`"${key}" is not a valid option to set`);
}
}
SetOptionError.SetOptionInnerError = SetOptionInnerError;
/*!
* Module exports
*/
module.exports = SetOptionError;

@@ -0,0 +1,35 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* Strict mode error constructor
*
* @param {String} path
* @param {String} [msg]
* @param {Boolean} [immutable]
* @inherits MongooseError
* @api private
*/
class StrictModeError extends MongooseError {
constructor(path, msg, immutable) {
msg = msg || 'Field `' + path + '` is not in schema and strict ' +
'mode is set to throw.';
super(msg);
this.isImmutableError = !!immutable;
this.path = path;
}
}
Object.defineProperty(StrictModeError.prototype, 'name', {
value: 'StrictModeError'
});
module.exports = StrictModeError;

@@ -0,0 +1,31 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
 * StrictPopulate Error constructor
*
* @param {String} path
* @param {String} [msg]
* @inherits MongooseError
* @api private
*/
class StrictPopulateError extends MongooseError {
constructor(path, msg) {
msg = msg || 'Cannot populate path `' + path + '` because it is not in your schema. ' + 'Set the `strictPopulate` option to false to override.';
super(msg);
this.path = path;
}
}
Object.defineProperty(StrictPopulateError.prototype, 'name', {
value: 'StrictPopulateError'
});
module.exports = StrictPopulateError;

@@ -0,0 +1,30 @@
'use strict';
/*!
* Module dependencies.
*/
const MongooseError = require('./mongooseError');
/**
* SyncIndexes Error constructor.
*
* @param {String} message
 * @param {Object} errorsMap
* @inherits MongooseError
* @api private
*/
class SyncIndexesError extends MongooseError {
constructor(message, errorsMap) {
super(message);
this.errors = errorsMap;
}
}
Object.defineProperty(SyncIndexesError.prototype, 'name', {
value: 'SyncIndexesError'
});
module.exports = SyncIndexesError;

@@ -0,0 +1,105 @@
/*!
* Module requirements
*/
'use strict';
const MongooseError = require('./mongooseError');
const getConstructorName = require('../helpers/getConstructorName');
const util = require('util');
const combinePathErrors = require('../helpers/error/combinePathErrors');
/**
* Document Validation Error
*
* @api private
* @param {Document} [instance]
* @inherits MongooseError
*/
class ValidationError extends MongooseError {
constructor(instance) {
let _message;
if (getConstructorName(instance) === 'model') {
_message = instance.constructor.modelName + ' validation failed';
} else {
_message = 'Validation failed';
}
super(_message);
this.errors = {};
this._message = _message;
if (instance) {
instance.$errors = this.errors;
}
}
/**
* Console.log helper
*/
toString() {
return this.name + ': ' + combinePathErrors(this);
}
/**
* inspect helper
* @api private
*/
inspect() {
return Object.assign(new Error(this.message), this);
}
/**
* add message
* @param {String} path
* @param {String|Error} error
* @api private
*/
addError(path, error) {
if (error instanceof ValidationError) {
const { errors } = error;
for (const errorPath of Object.keys(errors)) {
this.addError(`${path}.${errorPath}`, errors[errorPath]);
}
return;
}
this.errors[path] = error;
this.message = this._message + ': ' + combinePathErrors(this);
}
}
if (util.inspect.custom) {
// Avoid Node deprecation warning DEP0079
ValidationError.prototype[util.inspect.custom] = ValidationError.prototype.inspect;
}
/**
* Helper for JSON.stringify
* Ensure `name` and `message` show up in toJSON output re: gh-9847
* @api private
*/
Object.defineProperty(ValidationError.prototype, 'toJSON', {
enumerable: false,
writable: false,
configurable: true,
value: function() {
return Object.assign({}, this, { name: this.name, message: this.message });
}
});
Object.defineProperty(ValidationError.prototype, 'name', {
value: 'ValidationError'
});
/*!
* Module exports
*/
module.exports = ValidationError;

@@ -0,0 +1,100 @@
/*!
* Module dependencies.
*/
'use strict';
const MongooseError = require('./mongooseError');
/**
* Schema validator error
*
* @param {Object} properties
* @param {Document} doc
* @api private
*/
class ValidatorError extends MongooseError {
constructor(properties, doc) {
let msg = properties.message;
if (!msg) {
msg = MongooseError.messages.general.default;
}
const message = formatMessage(msg, properties, doc);
super(message);
properties = Object.assign({}, properties, { message: message });
this.properties = properties;
this.kind = properties.type;
this.path = properties.path;
this.value = properties.value;
this.reason = properties.reason;
}
/**
* toString helper
* TODO remove? This defaults to `${this.name}: ${this.message}`
* @api private
*/
toString() {
return this.message;
}
/**
* Ensure `name` and `message` show up in toJSON output re: gh-9296
* @api private
*/
toJSON() {
return Object.assign({ name: this.name, message: this.message }, this);
}
}
Object.defineProperty(ValidatorError.prototype, 'name', {
value: 'ValidatorError'
});
/**
* The object used to define this validator. Not enumerable to hide
* it from `require('util').inspect()` output re: gh-3925
* @api private
*/
Object.defineProperty(ValidatorError.prototype, 'properties', {
enumerable: false,
writable: true,
value: null
});
// Exposed for testing
ValidatorError.prototype.formatMessage = formatMessage;
/**
* Formats error messages
* @api private
*/
function formatMessage(msg, properties, doc) {
if (typeof msg === 'function') {
return msg(properties, doc);
}
const propertyNames = Object.keys(properties);
for (const propertyName of propertyNames) {
if (propertyName === 'message') {
continue;
}
msg = msg.replace('{' + propertyName.toUpperCase() + '}', properties[propertyName]);
}
return msg;
}
/*!
* exports
*/
module.exports = ValidatorError;
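Because `formatMessage` is exposed on the prototype for testing, the templating behavior is easy to exercise in isolation. The sketch below copies the substitution loop (dropping the unused `doc` argument) and runs it against the default message template from `MongooseError.messages.general.default`:

```javascript
// Standalone copy of the templating logic from ValidatorError.formatMessage:
// every non-`message` property is substituted for its `{UPPERCASED}` placeholder.
function formatMessage(msg, properties) {
  if (typeof msg === 'function') {
    return msg(properties);
  }
  for (const propertyName of Object.keys(properties)) {
    if (propertyName === 'message') {
      continue;
    }
    msg = msg.replace('{' + propertyName.toUpperCase() + '}', properties[propertyName]);
  }
  return msg;
}

const template = 'Validator failed for path `{PATH}` with value `{VALUE}`';
console.log(formatMessage(template, { path: 'email', value: 'not-an-email' }));
// → Validator failed for path `email` with value `not-an-email`
```

Note that a string pattern passed to `String.prototype.replace` only replaces the first occurrence, so a placeholder used twice in a custom template is only substituted once.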

@@ -0,0 +1,38 @@
'use strict';
/*!
* Module dependencies.
*/
const MongooseError = require('./mongooseError');
/**
* Version Error constructor.
*
* @param {Document} doc
* @param {Number} currentVersion
* @param {Array<String>} modifiedPaths
* @api private
*/
class VersionError extends MongooseError {
constructor(doc, currentVersion, modifiedPaths) {
const modifiedPathsStr = modifiedPaths.join(', ');
super('No matching document found for id "' + doc._doc._id +
'" version ' + currentVersion + ' modifiedPaths "' + modifiedPathsStr + '"');
this.version = currentVersion;
this.modifiedPaths = modifiedPaths;
}
}
Object.defineProperty(VersionError.prototype, 'name', {
value: 'VersionError'
});
/*!
* exports
*/
module.exports = VersionError;

@@ -0,0 +1,39 @@
'use strict';
module.exports = function prepareDiscriminatorPipeline(pipeline, schema, prefix) {
const discriminatorMapping = schema && schema.discriminatorMapping;
prefix = prefix || '';
if (discriminatorMapping && !discriminatorMapping.isRoot) {
const originalPipeline = pipeline;
const filterKey = (prefix.length > 0 ? prefix + '.' : prefix) + discriminatorMapping.key;
const discriminatorValue = discriminatorMapping.value;
// If the first pipeline stage is a `$match` and it doesn't already filter on
// the discriminator key, add the discriminator value to it. Merging into the
// existing `$match` avoids disturbing potential aggregation query optimizations.
if (originalPipeline[0] != null &&
originalPipeline[0].$match &&
(originalPipeline[0].$match[filterKey] === undefined || originalPipeline[0].$match[filterKey] === discriminatorValue)) {
originalPipeline[0].$match[filterKey] = discriminatorValue;
// `originalPipeline` is a ref, so there's no need for
// aggregate._pipeline = originalPipeline
} else if (originalPipeline[0] != null && originalPipeline[0].$geoNear) {
originalPipeline[0].$geoNear.query =
originalPipeline[0].$geoNear.query || {};
originalPipeline[0].$geoNear.query[filterKey] = discriminatorValue;
} else if (originalPipeline[0] != null && originalPipeline[0].$search) {
if (originalPipeline[1] && originalPipeline[1].$match != null) {
originalPipeline[1].$match[filterKey] = originalPipeline[1].$match[filterKey] || discriminatorValue;
} else {
const match = {};
match[filterKey] = discriminatorValue;
originalPipeline.splice(1, 0, { $match: match });
}
} else {
const match = {};
match[filterKey] = discriminatorValue;
originalPipeline.unshift({ $match: match });
}
}
};
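The merge behavior above can be sanity-checked with a hand-rolled stand-in. `addDiscriminatorFilter` below is a trimmed sketch (not the exported helper) covering only the leading-`$match` and fallback branches, and `schema` is a minimal mock exposing just the `discriminatorMapping` shape this code reads:

```javascript
// Trimmed sketch of the logic above: merge the discriminator value into a
// leading $match when possible, otherwise prepend a new $match stage.
// Omits the $geoNear and $search branches for brevity.
function addDiscriminatorFilter(pipeline, schema) {
  const mapping = schema && schema.discriminatorMapping;
  if (!mapping || mapping.isRoot) {
    return;
  }
  const first = pipeline[0];
  if (first != null && first.$match && first.$match[mapping.key] === undefined) {
    first.$match[mapping.key] = mapping.value; // merge into the existing $match
  } else {
    pipeline.unshift({ $match: { [mapping.key]: mapping.value } }); // prepend a new one
  }
}

const schema = { discriminatorMapping: { isRoot: false, key: '__t', value: 'Book' } };

const withMatch = [{ $match: { price: { $gt: 5 } } }];
addDiscriminatorFilter(withMatch, schema);
// withMatch[0].$match now also contains __t: 'Book'

const noMatch = [{ $sort: { price: 1 } }];
addDiscriminatorFilter(noMatch, schema);
// noMatch[0] is now { $match: { __t: 'Book' } }
```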

@@ -0,0 +1,50 @@
'use strict';
module.exports = function stringifyFunctionOperators(pipeline) {
if (!Array.isArray(pipeline)) {
return;
}
for (const stage of pipeline) {
if (stage == null) {
continue;
}
const canHaveAccumulator = stage.$group || stage.$bucket || stage.$bucketAuto;
if (canHaveAccumulator != null) {
for (const key of Object.keys(canHaveAccumulator)) {
handleAccumulator(canHaveAccumulator[key]);
}
}
const stageType = Object.keys(stage)[0];
if (stageType && typeof stage[stageType] === 'object') {
const stageOptions = stage[stageType];
for (const key of Object.keys(stageOptions)) {
if (stageOptions[key] != null &&
stageOptions[key].$function != null &&
typeof stageOptions[key].$function.body === 'function') {
stageOptions[key].$function.body = stageOptions[key].$function.body.toString();
}
}
}
if (stage.$facet != null) {
for (const key of Object.keys(stage.$facet)) {
stringifyFunctionOperators(stage.$facet[key]);
}
}
}
};
function handleAccumulator(operator) {
if (operator == null || operator.$accumulator == null) {
return;
}
for (const key of ['init', 'accumulate', 'merge', 'finalize']) {
if (typeof operator.$accumulator[key] === 'function') {
operator.$accumulator[key] = String(operator.$accumulator[key]);
}
}
}
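The `$accumulator` branch is the subtlest part; a standalone copy of `handleAccumulator` applied to an illustrative `$group` field shows the function-to-string conversion the server requires:

```javascript
// Standalone copy of handleAccumulator: converts the four user-supplied
// functions of an $accumulator operator into strings.
function handleAccumulator(operator) {
  if (operator == null || operator.$accumulator == null) {
    return;
  }
  for (const key of ['init', 'accumulate', 'merge', 'finalize']) {
    if (typeof operator.$accumulator[key] === 'function') {
      operator.$accumulator[key] = String(operator.$accumulator[key]);
    }
  }
}

// Illustrative $group field: sum a numeric field via $accumulator.
const groupStage = {
  _id: null,
  total: {
    $accumulator: {
      init: function() { return 0; },
      accumulate: function(state, n) { return state + n; },
      accumulateArgs: ['$n'],
      merge: function(a, b) { return a + b; },
      lang: 'js'
    }
  }
};

handleAccumulator(groupStage.total);
console.log(typeof groupStage.total.$accumulator.init); // → string
```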

@@ -0,0 +1,33 @@
'use strict';
module.exports = arrayDepth;
function arrayDepth(arr) {
if (!Array.isArray(arr)) {
return { min: 0, max: 0, containsNonArrayItem: true };
}
if (arr.length === 0) {
return { min: 1, max: 1, containsNonArrayItem: false };
}
if (arr.length === 1 && !Array.isArray(arr[0])) {
return { min: 1, max: 1, containsNonArrayItem: false };
}
const res = arrayDepth(arr[0]);
for (let i = 1; i < arr.length; ++i) {
const _res = arrayDepth(arr[i]);
if (_res.min < res.min) {
res.min = _res.min;
}
if (_res.max > res.max) {
res.max = _res.max;
}
res.containsNonArrayItem = res.containsNonArrayItem || _res.containsNonArrayItem;
}
res.min = res.min + 1;
res.max = res.max + 1;
return res;
}
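A few illustrative calls help pin down the return shape. The block below copies `arrayDepth` verbatim so the examples are runnable standalone; note that `containsNonArrayItem` reports whether recursion hit a non-array element along some branch:

```javascript
// Verbatim copy of the helper above.
function arrayDepth(arr) {
  if (!Array.isArray(arr)) {
    return { min: 0, max: 0, containsNonArrayItem: true };
  }
  if (arr.length === 0) {
    return { min: 1, max: 1, containsNonArrayItem: false };
  }
  if (arr.length === 1 && !Array.isArray(arr[0])) {
    return { min: 1, max: 1, containsNonArrayItem: false };
  }
  const res = arrayDepth(arr[0]);
  for (let i = 1; i < arr.length; ++i) {
    const _res = arrayDepth(arr[i]);
    if (_res.min < res.min) {
      res.min = _res.min;
    }
    if (_res.max > res.max) {
      res.max = _res.max;
    }
    res.containsNonArrayItem = res.containsNonArrayItem || _res.containsNonArrayItem;
  }
  res.min = res.min + 1;
  res.max = res.max + 1;
  return res;
}

console.log(arrayDepth([[1], [2]]));  // → { min: 2, max: 2, containsNonArrayItem: false }
console.log(arrayDepth([1, [2]]));    // → { min: 1, max: 2, containsNonArrayItem: true }
console.log(arrayDepth('not-array')); // → { min: 0, max: 0, containsNonArrayItem: true }
```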

@@ -0,0 +1,190 @@
'use strict';
const Decimal = require('../types/decimal128');
const ObjectId = require('../types/objectid');
const specialProperties = require('./specialProperties');
const isMongooseObject = require('./isMongooseObject');
const getFunctionName = require('./getFunctionName');
const isBsonType = require('./isBsonType');
const isMongooseArray = require('../types/array/isMongooseArray').isMongooseArray;
const isObject = require('./isObject');
const isPOJO = require('./isPOJO');
const symbols = require('./symbols');
const trustedSymbol = require('./query/trusted').trustedSymbol;
const BSON = require('bson');
/**
* Object clone with Mongoose natives support.
*
* If options.minimize is true, creates a minimal data object. Empty objects and undefined values will not be cloned. This makes the data payload sent to MongoDB as small as possible.
*
* Functions and primitives are never cloned.
*
* @param {Object} obj the object to clone
* @param {Object} options
* @param {Boolean} isArrayChild true if cloning immediately underneath an array. Special case for minimize.
* @return {Object} the cloned object
* @api private
*/
function clone(obj, options, isArrayChild) {
if (obj == null) {
return obj;
}
if (isBsonType(obj, 'Double')) {
return new BSON.Double(obj.value);
}
if (typeof obj === 'number' || typeof obj === 'string' || typeof obj === 'boolean' || typeof obj === 'bigint') {
return obj;
}
if (Array.isArray(obj)) {
return cloneArray(isMongooseArray(obj) ? obj.__array : obj, options);
}
if (isMongooseObject(obj)) {
if (options) {
if (options.retainDocuments && obj.$__ != null) {
const clonedDoc = obj.$clone();
if (obj.__index != null) {
clonedDoc.__index = obj.__index;
}
if (obj.__parentArray != null) {
clonedDoc.__parentArray = obj.__parentArray;
}
clonedDoc.$__parent = obj.$__parent;
return clonedDoc;
}
}
if (isPOJO(obj) && obj.$__ != null && obj._doc != null) {
return obj._doc;
}
let ret;
if (options && options.json && typeof obj.toJSON === 'function') {
ret = obj.toJSON(options);
} else {
ret = obj.toObject(options);
}
return ret;
}
const objConstructor = obj.constructor;
if (objConstructor) {
switch (getFunctionName(objConstructor)) {
case 'Object':
return cloneObject(obj, options, isArrayChild);
case 'Date':
return new objConstructor(+obj);
case 'RegExp':
return cloneRegExp(obj);
default:
// ignore
break;
}
}
if (isBsonType(obj, 'ObjectId')) {
if (options && options.flattenObjectIds) {
return obj.toJSON();
}
return new ObjectId(obj.id);
}
if (isBsonType(obj, 'Decimal128')) {
if (options && options.flattenDecimals) {
return obj.toJSON();
}
return Decimal.fromString(obj.toString());
}
// object created with Object.create(null)
if (!objConstructor && isObject(obj)) {
return cloneObject(obj, options, isArrayChild);
}
if (typeof obj === 'object' && obj[symbols.schemaTypeSymbol]) {
return obj.clone();
}
// If we're cloning this object to go into a MongoDB command,
// and there's a `toBSON()` function, assume this object will be
// stored as a primitive in MongoDB and doesn't need to be cloned.
if (options && options.bson && typeof obj.toBSON === 'function') {
return obj;
}
if (typeof obj.valueOf === 'function') {
return obj.valueOf();
}
return cloneObject(obj, options, isArrayChild);
}
module.exports = clone;
/*!
* ignore
*/
function cloneObject(obj, options, isArrayChild) {
const minimize = options && options.minimize;
const omitUndefined = options && options.omitUndefined;
const seen = options && options._seen;
const ret = {};
let hasKeys;
if (seen && seen.has(obj)) {
return seen.get(obj);
} else if (seen) {
seen.set(obj, ret);
}
if (trustedSymbol in obj && options?.copyTrustedSymbol !== false) {
ret[trustedSymbol] = obj[trustedSymbol];
}
const keys = Object.keys(obj);
const len = keys.length;
for (let i = 0; i < len; ++i) {
const key = keys[i];
if (specialProperties.has(key)) {
continue;
}
// Don't pass `isArrayChild` down
const val = clone(obj[key], options, false);
if ((minimize === false || omitUndefined) && typeof val === 'undefined') {
delete ret[key];
} else if (minimize !== true || (typeof val !== 'undefined')) {
hasKeys || (hasKeys = true);
ret[key] = val;
}
}
return minimize && !isArrayChild ? hasKeys && ret : ret;
}
function cloneArray(arr, options) {
let i = 0;
const len = arr.length;
const ret = new Array(len);
for (i = 0; i < len; ++i) {
ret[i] = clone(arr[i], options, true);
}
return ret;
}
function cloneRegExp(regexp) {
const ret = new RegExp(regexp.source, regexp.flags);
if (ret.lastIndex !== regexp.lastIndex) {
ret.lastIndex = regexp.lastIndex;
}
return ret;
}
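One subtlety worth calling out: `cloneRegExp` preserves `lastIndex`, so a global or sticky regexp that is mid-iteration keeps its position after cloning. A standalone copy demonstrates this:

```javascript
// Verbatim copy of the helper above: rebuild the pattern and carry over
// the in-progress match position.
function cloneRegExp(regexp) {
  const ret = new RegExp(regexp.source, regexp.flags);
  if (ret.lastIndex !== regexp.lastIndex) {
    ret.lastIndex = regexp.lastIndex;
  }
  return ret;
}

const re = /foo/g;
re.exec('foo foo'); // advances re.lastIndex to 3
const copy = cloneRegExp(re);
console.log(copy.source, copy.flags, copy.lastIndex); // → foo g 3
```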

@@ -0,0 +1,127 @@
'use strict';
/*!
* Module dependencies.
*/
const Binary = require('bson').Binary;
const isBsonType = require('./isBsonType');
const isMongooseObject = require('./isMongooseObject');
const MongooseError = require('../error');
const util = require('util');
exports.flatten = flatten;
exports.modifiedPaths = modifiedPaths;
/*!
* ignore
*/
function flatten(update, path, options, schema) {
let keys;
if (update && isMongooseObject(update) && !Buffer.isBuffer(update)) {
keys = Object.keys(update.toObject({ transform: false, virtuals: false }) || {});
} else {
keys = Object.keys(update || {});
}
const numKeys = keys.length;
const result = {};
path = path ? path + '.' : '';
for (let i = 0; i < numKeys; ++i) {
const key = keys[i];
const val = update[key];
result[path + key] = val;
// Avoid going into mixed paths if schema is specified
const keySchema = schema && schema.path && schema.path(path + key);
const isNested = schema && schema.nested && schema.nested[path + key];
if (keySchema && keySchema.instance === 'Mixed') continue;
if (shouldFlatten(val)) {
if (options && options.skipArrays && Array.isArray(val)) {
continue;
}
const flat = flatten(val, path + key, options, schema);
for (const k in flat) {
result[k] = flat[k];
}
if (Array.isArray(val)) {
result[path + key] = val;
}
}
if (isNested) {
const paths = Object.keys(schema.paths);
for (const p of paths) {
if (p.startsWith(path + key + '.') && !result.hasOwnProperty(p)) {
result[p] = void 0;
}
}
}
}
return result;
}
/*!
* ignore
*/
function modifiedPaths(update, path, result, recursion = null) {
if (update == null || typeof update !== 'object') {
return;
}
if (recursion == null) {
recursion = {
raw: { update, path },
trace: new WeakSet()
};
}
if (recursion.trace.has(update)) {
throw new MongooseError(`a circular reference in the update value, updateValue:
${util.inspect(recursion.raw.update, { showHidden: false, depth: 1 })}
updatePath: '${recursion.raw.path}'`);
}
recursion.trace.add(update);
const keys = Object.keys(update || {});
const numKeys = keys.length;
result = result || {};
path = path ? path + '.' : '';
for (let i = 0; i < numKeys; ++i) {
const key = keys[i];
let val = update[key];
const _path = path + key;
result[_path] = true;
if (!Buffer.isBuffer(val) && isMongooseObject(val)) {
val = val.toObject({ transform: false, virtuals: false });
}
if (shouldFlatten(val)) {
modifiedPaths(val, path + key, result, recursion);
}
}
recursion.trace.delete(update);
return result;
}
/*!
* ignore
*/
function shouldFlatten(val) {
return val &&
typeof val === 'object' &&
!(val instanceof Date) &&
!isBsonType(val, 'ObjectId') &&
(!Array.isArray(val) || val.length !== 0) &&
!(val instanceof Buffer) &&
!isBsonType(val, 'Decimal128') &&
!(val instanceof Binary);
}
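For plain objects (no Mongoose documents, no circular references), `modifiedPaths` reduces to a recursive key walk. The sketch below is a simplified standalone version of that core behavior, with `shouldRecurse` as a hypothetical stand-in for the `shouldFlatten` checks above:

```javascript
// Simplified sketch: recurse into plain objects and non-empty arrays,
// marking every dotted path seen along the way as modified.
function shouldRecurse(val) {
  return val != null &&
    typeof val === 'object' &&
    !(val instanceof Date) &&
    !Buffer.isBuffer(val) &&
    (!Array.isArray(val) || val.length !== 0);
}

function modifiedPaths(update, path = '', result = {}) {
  for (const key of Object.keys(update || {})) {
    const val = update[key];
    const _path = path ? path + '.' + key : key;
    result[_path] = true;
    if (shouldRecurse(val)) {
      modifiedPaths(val, _path, result);
    }
  }
  return result;
}

console.log(modifiedPaths({ name: 'test', nested: { count: 1 } }));
// → { name: true, nested: true, 'nested.count': true }
```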

@@ -0,0 +1,24 @@
'use strict';
/**
* Handles creating `{ type: 'object' }` vs `{ bsonType: 'object' }` vs `{ bsonType: ['object', 'null'] }`
*
* @param {String} type
* @param {String} bsonType
* @param {Boolean} useBsonType
* @param {Boolean} isRequired
*/
module.exports = function createJSONSchemaTypeArray(type, bsonType, useBsonType, isRequired) {
if (useBsonType) {
if (isRequired) {
return { bsonType };
}
return { bsonType: [bsonType, 'null'] };
} else {
if (isRequired) {
return { type };
}
return { type: [type, 'null'] };
}
};
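Enumerating the input combinations makes the mapping concrete; the block below copies the helper so the calls are runnable standalone:

```javascript
// Verbatim copy of the helper above: optional (non-required) paths also
// accept 'null' in the generated JSON schema type.
function createJSONSchemaTypeArray(type, bsonType, useBsonType, isRequired) {
  if (useBsonType) {
    if (isRequired) {
      return { bsonType };
    }
    return { bsonType: [bsonType, 'null'] };
  } else {
    if (isRequired) {
      return { type };
    }
    return { type: [type, 'null'] };
  }
}

console.log(createJSONSchemaTypeArray('string', 'string', true, true));
// → { bsonType: 'string' }
console.log(createJSONSchemaTypeArray('string', 'string', true, false));
// → { bsonType: [ 'string', 'null' ] }
console.log(createJSONSchemaTypeArray('number', 'double', false, false));
// → { type: [ 'number', 'null' ] }
```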

@@ -0,0 +1,225 @@
'use strict';
/*!
* Module dependencies.
*/
const EachAsyncMultiError = require('../../error/eachAsyncMultiError');
const immediate = require('../immediate');
/**
* Execute `fn` for every document in the cursor. If `fn` returns a promise,
* will wait for the promise to resolve before iterating on to the next one.
* Returns a promise that resolves when done.
*
* @param {Function} next the thunk to call to get the next document
* @param {Function} fn
* @param {Object} options
* @param {Number} [options.batchSize=null] if set, Mongoose will call `fn` with an array of at most `batchSize` documents, instead of a single document
* @param {Number} [options.parallel=1] maximum number of `fn` calls that Mongoose will run in parallel
 * @param {AbortSignal} [options.signal] allow cancelling this eachAsync(). Once the abort signal is fired, `eachAsync()` will immediately fulfill the returned promise (or call the callback) and not fetch any more documents.
 * @param {Boolean} [options.continueOnError=false] if true, keep iterating even if `fn` throws, and throw a single aggregated error after the cursor is exhausted
* @return {Promise}
* @api public
* @method eachAsync
*/
module.exports = async function eachAsync(next, fn, options) {
const parallel = options.parallel || 1;
const batchSize = options.batchSize;
const signal = options.signal;
const continueOnError = options.continueOnError;
const aggregatedErrors = [];
const enqueue = asyncQueue();
let aborted = false;
return new Promise((resolve, reject) => {
if (signal != null) {
if (signal.aborted) {
return resolve(null);
}
signal.addEventListener('abort', () => {
aborted = true;
return resolve(null);
}, { once: true });
}
if (batchSize != null) {
if (typeof batchSize !== 'number') {
throw new TypeError('batchSize must be a number');
} else if (!Number.isInteger(batchSize)) {
throw new TypeError('batchSize must be an integer');
} else if (batchSize < 1) {
throw new TypeError('batchSize must be at least 1');
}
}
iterate((err, res) => {
if (err != null) {
return reject(err);
}
resolve(res);
});
});
function iterate(finalCallback) {
let handleResultsInProgress = 0;
let currentDocumentIndex = 0;
let error = null;
for (let i = 0; i < parallel; ++i) {
enqueue(createFetch());
}
function createFetch() {
let documentsBatch = [];
let drained = false;
return fetch;
function fetch(done) {
if (drained || aborted) {
return done();
} else if (error) {
return done();
}
next(function(err, doc) {
if (error != null) {
return done();
}
if (err != null) {
if (err.name === 'MongoCursorExhaustedError') {
// We may end up calling `next()` multiple times on an exhausted
// cursor, which leads to an error. In case cursor is exhausted,
// just treat it as if the cursor returned no document, which is
// how a cursor indicates it is exhausted.
doc = null;
} else if (continueOnError) {
aggregatedErrors.push(err);
} else {
error = err;
finalCallback(err);
return done();
}
}
if (doc == null) {
drained = true;
if (handleResultsInProgress <= 0) {
const finalErr = continueOnError ?
createEachAsyncMultiError(aggregatedErrors) :
error;
finalCallback(finalErr);
} else if (batchSize && documentsBatch.length) {
handleNextResult(documentsBatch, currentDocumentIndex++, handleNextResultCallBack);
}
return done();
}
++handleResultsInProgress;
// Kick off the subsequent `next()` before handling the result, but
// make sure we know that we still have a result to handle re: #8422
immediate(() => done());
if (batchSize) {
documentsBatch.push(doc);
}
// If the current documents size is less than the provided batch size don't process the documents yet
if (batchSize && documentsBatch.length !== batchSize) {
immediate(() => enqueue(fetch));
return;
}
const docsToProcess = batchSize ? documentsBatch : doc;
function handleNextResultCallBack(err) {
if (batchSize) {
handleResultsInProgress -= documentsBatch.length;
documentsBatch = [];
} else {
--handleResultsInProgress;
}
if (err != null) {
if (continueOnError) {
aggregatedErrors.push(err);
} else {
error = err;
return finalCallback(err);
}
}
if ((drained || aborted) && handleResultsInProgress <= 0) {
const finalErr = continueOnError ?
createEachAsyncMultiError(aggregatedErrors) :
error;
return finalCallback(finalErr);
}
immediate(() => enqueue(fetch));
}
handleNextResult(docsToProcess, currentDocumentIndex++, handleNextResultCallBack);
});
}
}
}
function handleNextResult(doc, i, callback) {
let maybePromise;
try {
maybePromise = fn(doc, i);
} catch (err) {
return callback(err);
}
if (maybePromise && typeof maybePromise.then === 'function') {
maybePromise.then(
function() { callback(null); },
function(error) {
callback(error || new Error('`eachAsync()` promise rejected without error'));
});
} else {
callback(null);
}
}
};
// `next()` can only execute one at a time, so make sure we always execute
// `next()` in series, while still allowing multiple `fn()` instances to run
// in parallel.
function asyncQueue() {
const _queue = [];
let inProgress = null;
let id = 0;
return function enqueue(fn) {
if (
inProgress === null &&
_queue.length === 0
) {
inProgress = id++;
return fn(_step);
}
_queue.push(fn);
};
function _step() {
if (_queue.length !== 0) {
inProgress = id++;
const fn = _queue.shift();
fn(_step);
} else {
inProgress = null;
}
}
}
function createEachAsyncMultiError(aggregatedErrors) {
if (aggregatedErrors.length === 0) {
return null;
}
return new EachAsyncMultiError(aggregatedErrors);
}
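The `asyncQueue()` function above is what keeps cursor access safe: `next()` may only run one call at a time, so queued fetches are serialized even when `options.parallel > 1`. A self-contained sketch (the queue copied verbatim, with a synchronous driver to make the ordering visible):

```javascript
// Verbatim copy of the asyncQueue() serializer from eachAsync.
function asyncQueue() {
  const _queue = [];
  let inProgress = null;
  let id = 0;
  return function enqueue(fn) {
    if (inProgress === null && _queue.length === 0) {
      inProgress = id++;
      return fn(_step);
    }
    _queue.push(fn);
  };
  function _step() {
    if (_queue.length !== 0) {
      inProgress = id++;
      const fn = _queue.shift();
      fn(_step);
    } else {
      inProgress = null;
    }
  }
}

const order = [];
const enqueue = asyncQueue();
let release;
enqueue(done => { order.push('a'); release = done; }); // runs immediately, holds the queue
enqueue(done => { order.push('b'); done(); });         // parked in _queue
enqueue(done => { order.push('c'); done(); });         // parked in _queue
console.log(order.join('')); // 'a' -- only one task is ever in flight
release();                   // first task completes; 'b' then 'c' drain in FIFO order
console.log(order.join('')); // 'abc'
```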

@@ -0,0 +1,36 @@
'use strict';
module.exports = applyEmbeddedDiscriminators;
function applyEmbeddedDiscriminators(schema, seen = new WeakSet(), overwriteExisting = false) {
if (seen.has(schema)) {
return;
}
seen.add(schema);
for (const path of Object.keys(schema.paths)) {
const schemaType = schema.paths[path];
if (!schemaType.schema) {
continue;
}
applyEmbeddedDiscriminators(schemaType.schema, seen);
if (!schemaType.schema._applyDiscriminators) {
continue;
}
if (schemaType._appliedDiscriminators && !overwriteExisting) {
continue;
}
for (const discriminatorKey of schemaType.schema._applyDiscriminators.keys()) {
const {
schema: discriminatorSchema,
options
} = schemaType.schema._applyDiscriminators.get(discriminatorKey);
applyEmbeddedDiscriminators(discriminatorSchema, seen);
schemaType.discriminator(
discriminatorKey,
discriminatorSchema,
overwriteExisting ? { ...options, overwriteExisting: true } : options
);
}
schemaType._appliedDiscriminators = true;
}
}

@@ -0,0 +1,16 @@
'use strict';
const isBsonType = require('../isBsonType');
module.exports = function areDiscriminatorValuesEqual(a, b) {
if (typeof a === 'string' && typeof b === 'string') {
return a === b;
}
if (typeof a === 'number' && typeof b === 'number') {
return a === b;
}
if (isBsonType(a, 'ObjectId') && isBsonType(b, 'ObjectId')) {
return a.toString() === b.toString();
}
return false;
};

@@ -0,0 +1,12 @@
'use strict';
module.exports = function checkEmbeddedDiscriminatorKeyProjection(userProjection, path, schema, selected, addedPaths) {
const userProjectedInPath = Object.keys(userProjection).
reduce((cur, key) => cur || key.startsWith(path + '.'), false);
const _discriminatorKey = path + '.' + schema.options.discriminatorKey;
if (!userProjectedInPath &&
addedPaths.length === 1 &&
addedPaths[0] === _discriminatorKey) {
selected.splice(selected.indexOf(_discriminatorKey), 1);
}
};

@@ -0,0 +1,29 @@
'use strict';
const getDiscriminatorByValue = require('./getDiscriminatorByValue');
/**
* Find the correct constructor, taking into account discriminators
* @api private
*/
module.exports = function getConstructor(Constructor, value, defaultDiscriminatorValue) {
const discriminatorKey = Constructor.schema.options.discriminatorKey;
let discriminatorValue = (value != null && value[discriminatorKey]);
if (discriminatorValue == null) {
discriminatorValue = defaultDiscriminatorValue;
}
if (Constructor.discriminators &&
discriminatorValue != null) {
if (Constructor.discriminators[discriminatorValue]) {
Constructor = Constructor.discriminators[discriminatorValue];
} else {
const constructorByValue = getDiscriminatorByValue(Constructor.discriminators, discriminatorValue);
if (constructorByValue) {
Constructor = constructorByValue;
}
}
}
return Constructor;
};

@@ -0,0 +1,28 @@
'use strict';
const areDiscriminatorValuesEqual = require('./areDiscriminatorValuesEqual');
/**
* returns discriminator by discriminatorMapping.value
*
* @param {Object} discriminators
* @param {string} value
* @api private
*/
module.exports = function getDiscriminatorByValue(discriminators, value) {
if (discriminators == null) {
return null;
}
for (const name of Object.keys(discriminators)) {
const it = discriminators[name];
if (
it.schema &&
it.schema.discriminatorMapping &&
areDiscriminatorValuesEqual(it.schema.discriminatorMapping.value, value)
) {
return it;
}
}
return null;
};

@@ -0,0 +1,27 @@
'use strict';
const areDiscriminatorValuesEqual = require('./areDiscriminatorValuesEqual');
/**
* returns discriminator by discriminatorMapping.value
*
* @param {Schema} schema
* @param {string} value
* @api private
*/
module.exports = function getSchemaDiscriminatorByValue(schema, value) {
if (schema == null || schema.discriminators == null) {
return null;
}
for (const key of Object.keys(schema.discriminators)) {
const discriminatorSchema = schema.discriminators[key];
if (discriminatorSchema.discriminatorMapping == null) {
continue;
}
if (areDiscriminatorValuesEqual(discriminatorSchema.discriminatorMapping.value, value)) {
return discriminatorSchema;
}
}
return null;
};

@@ -0,0 +1,81 @@
'use strict';
const schemaMerge = require('../schema/merge');
const specialProperties = require('../../helpers/specialProperties');
const isBsonType = require('../../helpers/isBsonType');
const ObjectId = require('../../types/objectid');
const isObject = require('../../helpers/isObject');
/**
* Merges `from` into `to` without overwriting existing properties.
*
* @param {Object} to
* @param {Object} from
* @param {String} [path]
* @api private
*/
module.exports = function mergeDiscriminatorSchema(to, from, path, seen) {
const keys = Object.keys(from);
let i = 0;
const len = keys.length;
let key;
path = path || '';
seen = seen || new WeakSet();
if (seen.has(from)) {
return;
}
seen.add(from);
while (i < len) {
key = keys[i++];
if (!path) {
if (key === 'discriminators' ||
key === 'base' ||
key === '_applyDiscriminators' ||
key === '_userProvidedOptions' ||
key === 'options' ||
key === 'tree') {
continue;
}
}
if (path === 'tree' && from != null && from.instanceOfSchema) {
continue;
}
if (specialProperties.has(key)) {
continue;
}
if (to[key] == null) {
to[key] = from[key];
} else if (isObject(from[key])) {
if (!isObject(to[key])) {
to[key] = {};
}
if (from[key] != null) {
// Skip merging schemas if we're creating a discriminator schema and
// base schema has a given path as a single nested but discriminator schema
// has the path as a document array, or vice versa (gh-9534)
if ((from[key].$isSingleNested && to[key].$isMongooseDocumentArray) ||
(from[key].$isMongooseDocumentArray && to[key].$isSingleNested) ||
(from[key].$isMongooseDocumentArrayElement && to[key].$isMongooseDocumentArrayElement)) {
continue;
} else if (from[key].instanceOfSchema) {
if (to[key].instanceOfSchema) {
schemaMerge(to[key], from[key].clone(), true);
} else {
to[key] = from[key].clone();
}
continue;
} else if (isBsonType(from[key], 'ObjectId')) {
to[key] = new ObjectId(from[key]);
continue;
}
}
mergeDiscriminatorSchema(to[key], from[key], path ? path + '.' + key : key, seen);
}
}
if (from != null && from.instanceOfSchema) {
to.tree = Object.assign({}, from.tree, to.tree);
}
};

@@ -0,0 +1,132 @@
'use strict';
const isNestedProjection = require('../projection/isNestedProjection');
module.exports = function applyDefaults(doc, fields, exclude, hasIncludedChildren, isBeforeSetters, pathsToSkip, options) {
const paths = Object.keys(doc.$__schema.paths);
const plen = paths.length;
const skipParentChangeTracking = options && options.skipParentChangeTracking;
for (let i = 0; i < plen; ++i) {
let def;
let curPath = '';
const p = paths[i];
if (p === '_id' && doc.$__.skipId) {
continue;
}
const type = doc.$__schema.paths[p];
const path = type.splitPath();
const len = path.length;
if (path[len - 1] === '$*') {
continue;
}
let included = false;
let doc_ = doc._doc;
for (let j = 0; j < len; ++j) {
if (doc_ == null) {
break;
}
const piece = path[j];
curPath += (!curPath.length ? '' : '.') + piece;
if (exclude === true) {
if (curPath in fields) {
break;
}
} else if (exclude === false && fields && !included) {
const hasSubpaths = type.$isSingleNested || type.$isMongooseDocumentArray;
if ((curPath in fields && !isNestedProjection(fields[curPath])) || (j === len - 1 && hasSubpaths && hasIncludedChildren != null && hasIncludedChildren[curPath])) {
included = true;
} else if (hasIncludedChildren != null && !hasIncludedChildren[curPath]) {
break;
}
}
if (j === len - 1) {
if (doc_[piece] !== void 0) {
break;
}
if (isBeforeSetters != null) {
if (typeof type.defaultValue === 'function') {
if (!type.defaultValue.$runBeforeSetters && isBeforeSetters) {
break;
}
if (type.defaultValue.$runBeforeSetters && !isBeforeSetters) {
break;
}
} else if (!isBeforeSetters) {
// Non-function defaults should always run **before** setters
continue;
}
}
if (pathsToSkip && pathsToSkip[curPath]) {
break;
}
if (fields && exclude !== null) {
if (exclude === true) {
// apply defaults to all non-excluded fields
if (p in fields) {
continue;
}
try {
def = type.getDefault(doc, false);
} catch (err) {
doc.invalidate(p, err);
break;
}
if (typeof def !== 'undefined') {
doc_[piece] = def;
applyChangeTracking(doc, p, skipParentChangeTracking);
}
} else if (included) {
// selected field
try {
def = type.getDefault(doc, false);
} catch (err) {
doc.invalidate(p, err);
break;
}
if (typeof def !== 'undefined') {
doc_[piece] = def;
applyChangeTracking(doc, p, skipParentChangeTracking);
}
}
} else {
try {
def = type.getDefault(doc, false);
} catch (err) {
doc.invalidate(p, err);
break;
}
if (typeof def !== 'undefined') {
doc_[piece] = def;
applyChangeTracking(doc, p, skipParentChangeTracking);
}
}
} else {
doc_ = doc_[piece];
}
}
}
};
/*!
* ignore
*/
function applyChangeTracking(doc, fullPath, skipParentChangeTracking) {
doc.$__.activePaths.default(fullPath);
if (!skipParentChangeTracking && doc.$isSubdocument && doc.$isSingleNested && doc.$parent() != null) {
doc.$parent().$__.activePaths.default(doc.$__pathRelativeToParent(fullPath));
}
}

@@ -0,0 +1,105 @@
'use strict';
const handleTimestampOption = require('../schema/handleTimestampOption');
const mpath = require('mpath');
module.exports = applyTimestamps;
/**
* Apply a given schema's timestamps to the given POJO
*
* @param {Schema} schema
* @param {Object} obj
* @param {Object} [options]
* @param {Boolean} [options.isUpdate=false] if true, treat this as an update: just set updatedAt, skip setting createdAt. If false, set both createdAt and updatedAt
* @param {Function} [options.currentTime] if set, Mongoose will call this function to get the current time.
*/
function applyTimestamps(schema, obj, options) {
if (obj == null) {
return obj;
}
applyTimestampsToChildren(schema, obj, options);
return applyTimestampsToDoc(schema, obj, options);
}
/**
* Apply timestamps to any subdocuments
*
* @param {Schema} schema subdocument schema
* @param {Object} res subdocument
* @param {Object} [options]
* @param {Boolean} [options.isUpdate=false] if true, treat this as an update: just set updatedAt, skip setting createdAt. If false, set both createdAt and updatedAt
* @param {Function} [options.currentTime] if set, Mongoose will call this function to get the current time.
*/
function applyTimestampsToChildren(schema, res, options) {
for (const childSchema of schema.childSchemas) {
const _path = childSchema.model.path;
const _schema = childSchema.schema;
if (!_path) {
continue;
}
const _obj = mpath.get(_path, res);
if (_obj == null || (Array.isArray(_obj) && _obj.flat(Infinity).length === 0)) {
continue;
}
applyTimestamps(_schema, _obj, options);
}
}
/**
* Apply timestamps to a given document. Does not apply timestamps to subdocuments: use `applyTimestampsToChildren` instead
*
* @param {Schema} schema
* @param {Object} obj
* @param {Object} [options]
* @param {Boolean} [options.isUpdate=false] if true, treat this as an update: just set updatedAt, skip setting createdAt. If false, set both createdAt and updatedAt
* @param {Function} [options.currentTime] if set, Mongoose will call this function to get the current time.
*/
function applyTimestampsToDoc(schema, obj, options) {
if (obj == null || typeof obj !== 'object') {
return;
}
if (Array.isArray(obj)) {
for (const el of obj) {
applyTimestampsToDoc(schema, el, options);
}
return;
}
if (schema.discriminators && Object.keys(schema.discriminators).length > 0) {
for (const discriminatorKey of Object.keys(schema.discriminators)) {
const discriminator = schema.discriminators[discriminatorKey];
const key = discriminator.discriminatorMapping.key;
const value = discriminator.discriminatorMapping.value;
if (obj[key] == value) {
schema = discriminator;
break;
}
}
}
const createdAt = handleTimestampOption(schema.options.timestamps, 'createdAt');
const updatedAt = handleTimestampOption(schema.options.timestamps, 'updatedAt');
const currentTime = options?.currentTime;
let ts = null;
if (currentTime != null) {
ts = currentTime();
} else if (schema.base?.now) {
ts = schema.base.now();
} else {
ts = new Date();
}
if (createdAt && obj[createdAt] == null && !options?.isUpdate) {
obj[createdAt] = ts;
}
if (updatedAt) {
obj[updatedAt] = ts;
}
}
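The branching at the end of `applyTimestampsToDoc` reduces to a small rule: updates touch only `updatedAt`, while inserts also fill a missing `createdAt`. A minimal stand-in sketch (the `stampTimestamps` name and the plain-object shape are illustrative, not Mongoose API):

```javascript
// Illustrative reduction of applyTimestampsToDoc's final branch.
function stampTimestamps(obj, { isUpdate = false, currentTime } = {}) {
  const ts = currentTime ? currentTime() : new Date();
  if (!isUpdate && obj.createdAt == null) {
    obj.createdAt = ts; // inserts get a createdAt unless one is already set
  }
  obj.updatedAt = ts;   // every write bumps updatedAt
  return obj;
}

const t0 = new Date('2021-01-01');
const t1 = new Date('2022-02-02');

const inserted = stampTimestamps({}, { currentTime: () => t0 });
console.log(inserted.createdAt === t0 && inserted.updatedAt === t0); // true

const updated = stampTimestamps({ createdAt: t0 }, { isUpdate: true, currentTime: () => t1 });
console.log(updated.createdAt === t0); // true -- createdAt untouched on update
console.log(updated.updatedAt === t1); // true
```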

@@ -0,0 +1,146 @@
'use strict';
const mpath = require('mpath');
module.exports = applyVirtuals;
/**
* Apply a given schema's virtuals to a given POJO
*
* @param {Schema} schema
* @param {Object} obj
* @param {Array<string>} [virtuals] optional whitelist of virtuals to apply
* @returns
*/
function applyVirtuals(schema, obj, virtuals) {
if (obj == null) {
return obj;
}
let virtualsForChildren = virtuals;
let toApply = null;
if (Array.isArray(virtuals)) {
virtualsForChildren = [];
toApply = [];
for (const virtual of virtuals) {
if (virtual.length === 1) {
toApply.push(virtual[0]);
} else {
virtualsForChildren.push(virtual);
}
}
}
applyVirtualsToChildren(schema, obj, virtualsForChildren);
return applyVirtualsToDoc(schema, obj, toApply);
}
/**
* Apply virtuals to any subdocuments
*
* @param {Schema} schema subdocument schema
* @param {Object} res subdocument
* @param {Array<String>} [virtuals] optional whitelist of virtuals to apply
*/
function applyVirtualsToChildren(schema, res, virtuals) {
let attachedVirtuals = false;
for (const childSchema of schema.childSchemas) {
const _path = childSchema.model.path;
const _schema = childSchema.schema;
if (!_path) {
continue;
}
const _obj = mpath.get(_path, res);
if (_obj == null || (Array.isArray(_obj) && _obj.flat(Infinity).length === 0)) {
continue;
}
let virtualsForChild = null;
if (Array.isArray(virtuals)) {
virtualsForChild = [];
for (const virtual of virtuals) {
if (virtual[0] == _path) {
virtualsForChild.push(virtual.slice(1));
}
}
if (virtualsForChild.length === 0) {
continue;
}
}
applyVirtuals(_schema, _obj, virtualsForChild);
attachedVirtuals = true;
}
if (virtuals && virtuals.length && !attachedVirtuals) {
applyVirtualsToDoc(schema, res, virtuals);
}
}
/**
* Apply virtuals to a given document. Does not apply virtuals to subdocuments: use `applyVirtualsToChildren` instead
*
* @param {Schema} schema
* @param {Object} doc
* @param {Array<String>} [virtuals] optional whitelist of virtuals to apply
* @returns
*/
function applyVirtualsToDoc(schema, obj, virtuals) {
if (obj == null || typeof obj !== 'object') {
return;
}
if (Array.isArray(obj)) {
for (const el of obj) {
applyVirtualsToDoc(schema, el, virtuals);
}
return;
}
if (schema.discriminators && Object.keys(schema.discriminators).length > 0) {
for (const discriminatorKey of Object.keys(schema.discriminators)) {
const discriminator = schema.discriminators[discriminatorKey];
const key = discriminator.discriminatorMapping.key;
const value = discriminator.discriminatorMapping.value;
if (obj[key] == value) {
schema = discriminator;
break;
}
}
}
if (virtuals == null) {
virtuals = Object.keys(schema.virtuals);
}
for (const virtual of virtuals) {
if (schema.virtuals[virtual] == null) {
continue;
}
const virtualType = schema.virtuals[virtual];
const sp = Array.isArray(virtual)
? virtual
: virtual.indexOf('.') === -1
? [virtual]
: virtual.split('.');
let cur = obj;
for (let i = 0; i < sp.length - 1; ++i) {
cur[sp[i]] = sp[i] in cur ? cur[sp[i]] : {};
cur = cur[sp[i]];
}
let val = virtualType.applyGetters(cur[sp[sp.length - 1]], obj);
const isPopulateVirtual =
virtualType.options && (virtualType.options.ref || virtualType.options.refPath);
if (isPopulateVirtual && val === undefined) {
if (virtualType.options.justOne) {
val = null;
} else {
val = [];
}
}
cur[sp[sp.length - 1]] = val;
}
}

@@ -0,0 +1,45 @@
'use strict';
/*!
* ignore
*/
module.exports = function cleanModifiedSubpaths(doc, path, options) {
options = options || {};
const skipDocArrays = options.skipDocArrays;
let deleted = 0;
if (!doc) {
return deleted;
}
for (const modifiedPath of Object.keys(doc.$__.activePaths.getStatePaths('modify'))) {
if (skipDocArrays) {
const schemaType = doc.$__schema.path(modifiedPath);
if (schemaType && schemaType.$isMongooseDocumentArray) {
continue;
}
}
if (modifiedPath.startsWith(path + '.')) {
doc.$__.activePaths.clearPath(modifiedPath);
++deleted;
if (doc.$isSubdocument) {
cleanParent(doc, modifiedPath);
}
}
}
return deleted;
};
function cleanParent(doc, path, seen = new Set()) {
if (seen.has(doc)) {
throw new Error('Infinite subdocument loop: subdoc with _id ' + doc._id + ' is a parent of itself');
}
const parent = doc.$parent();
const newPath = doc.$__pathRelativeToParent(void 0, false) + '.' + path;
parent.$__.activePaths.clearPath(newPath);
if (parent.$isSubdocument) {
cleanParent(parent, newPath, seen);
}
}

@@ -0,0 +1,238 @@
'use strict';
const clone = require('../../helpers/clone');
const documentSchemaSymbol = require('../../helpers/symbols').documentSchemaSymbol;
const internalToObjectOptions = require('../../options').internalToObjectOptions;
const utils = require('../../utils');
let Document;
const getSymbol = require('../../helpers/symbols').getSymbol;
const scopeSymbol = require('../../helpers/symbols').scopeSymbol;
const isPOJO = utils.isPOJO;
/*!
* exports
*/
exports.compile = compile;
exports.defineKey = defineKey;
const _isEmptyOptions = Object.freeze({
minimize: true,
virtuals: false,
getters: false,
transform: false
});
const noDottedPathGetOptions = Object.freeze({
noDottedPath: true
});
/**
* Compiles schemas.
* @param {Object} tree
* @param {Any} proto
* @param {String} prefix
* @param {Object} options
* @api private
*/
function compile(tree, proto, prefix, options) {
Document = Document || require('../../document');
const typeKey = options.typeKey;
for (const key of Object.keys(tree)) {
const limb = tree[key];
const hasSubprops = isPOJO(limb) &&
Object.keys(limb).length > 0 &&
(!limb[typeKey] || (typeKey === 'type' && isPOJO(limb.type) && limb.type.type));
const subprops = hasSubprops ? limb : null;
defineKey({ prop: key, subprops: subprops, prototype: proto, prefix: prefix, options: options });
}
}
/**
* Defines the accessor named prop on the incoming prototype.
* @param {Object} options
* @param {String} options.prop
* @param {Boolean} options.subprops
* @param {Any} options.prototype
* @param {String} [options.prefix]
* @param {Object} options.options
* @api private
*/
function defineKey({ prop, subprops, prototype, prefix, options }) {
Document = Document || require('../../document');
const path = (prefix ? prefix + '.' : '') + prop;
prefix = prefix || '';
const useGetOptions = prefix ? Object.freeze({}) : noDottedPathGetOptions;
if (subprops) {
Object.defineProperty(prototype, prop, {
enumerable: true,
configurable: true,
get: function() {
const _this = this;
if (!this.$__.getters) {
this.$__.getters = {};
}
if (!this.$__.getters[path]) {
const nested = Object.create(Document.prototype, getOwnPropertyDescriptors(this));
// save scope for nested getters/setters
if (!prefix) {
nested.$__[scopeSymbol] = this;
}
nested.$__.nestedPath = path;
Object.defineProperty(nested, 'schema', {
enumerable: false,
configurable: true,
writable: false,
value: prototype.schema
});
Object.defineProperty(nested, '$__schema', {
enumerable: false,
configurable: true,
writable: false,
value: prototype.schema
});
Object.defineProperty(nested, documentSchemaSymbol, {
enumerable: false,
configurable: true,
writable: false,
value: prototype.schema
});
Object.defineProperty(nested, 'toObject', {
enumerable: false,
configurable: true,
writable: false,
value: function() {
return clone(_this.get(path, null, {
virtuals: this &&
this.schema &&
this.schema.options &&
this.schema.options.toObject &&
this.schema.options.toObject.virtuals || null
}));
}
});
Object.defineProperty(nested, '$__get', {
enumerable: false,
configurable: true,
writable: false,
value: function() {
return _this.get(path, null, {
virtuals: this && this.schema && this.schema.options && this.schema.options.toObject && this.schema.options.toObject.virtuals || null
});
}
});
Object.defineProperty(nested, 'toJSON', {
enumerable: false,
configurable: true,
writable: false,
value: function() {
return _this.get(path, null, {
virtuals: this && this.schema && this.schema.options && this.schema.options.toJSON && this.schema.options.toJSON.virtuals || null
});
}
});
Object.defineProperty(nested, '$__isNested', {
enumerable: false,
configurable: true,
writable: false,
value: true
});
Object.defineProperty(nested, '$isEmpty', {
enumerable: false,
configurable: true,
writable: false,
value: function() {
return Object.keys(this.get(path, null, _isEmptyOptions) || {}).length === 0;
}
});
Object.defineProperty(nested, '$__parent', {
enumerable: false,
configurable: true,
writable: false,
value: this
});
compile(subprops, nested, path, options);
this.$__.getters[path] = nested;
}
return this.$__.getters[path];
},
set: function(v) {
if (v != null && v.$__isNested) {
// Convert top-level to POJO, but leave subdocs hydrated so `$set`
// can handle them. See gh-9293.
v = v.$__get();
} else if (v instanceof Document && !v.$__isNested) {
v = v.$toObject(internalToObjectOptions);
}
const doc = this.$__[scopeSymbol] || this;
doc.$set(path, v);
}
});
} else {
Object.defineProperty(prototype, prop, {
enumerable: true,
configurable: true,
get: function() {
return this[getSymbol].call(
this.$__[scopeSymbol] || this,
path,
null,
useGetOptions
);
},
set: function(v) {
this.$set.call(this.$__[scopeSymbol] || this, path, v);
}
});
}
}
// gets descriptors for all properties of `object`
// makes all properties non-enumerable to match previous behavior to #2211
function getOwnPropertyDescriptors(object) {
const result = {};
Object.getOwnPropertyNames(object).forEach(function(key) {
const skip = [
'isNew',
'$__',
'$errors',
'errors',
'_doc',
'$locals',
'$op',
'__parentArray',
'__index',
'$isDocumentArrayElement'
].indexOf(key) === -1;
if (skip) {
return;
}
result[key] = Object.getOwnPropertyDescriptor(object, key);
result[key].enumerable = false;
});
return result;
}

@@ -0,0 +1,38 @@
'use strict';
/**
* Find the deepest subdocument along a given path to ensure setter functions run
* with the correct subdocument as `this`. If no subdocuments, returns the top-level
* document.
*
* @param {Document} doc
* @param {String[]} parts
* @param {Schema} schema
* @returns Document
*/
module.exports = function getDeepestSubdocumentForPath(doc, parts, schema) {
let curPath = parts[0];
let curSchema = schema;
let subdoc = doc;
for (let i = 0; i < parts.length - 1; ++i) {
const curSchemaType = curSchema.path(curPath);
if (curSchemaType && curSchemaType.schema) {
let newSubdoc = subdoc.get(curPath);
curSchema = curSchemaType.schema;
curPath = parts[i + 1];
if (Array.isArray(newSubdoc) && !isNaN(curPath)) {
newSubdoc = newSubdoc[curPath];
curPath = '';
}
if (newSubdoc == null) {
break;
}
subdoc = newSubdoc;
} else {
curPath += curPath.length ? '.' + parts[i + 1] : parts[i + 1];
}
}
return subdoc;
};

@@ -0,0 +1,53 @@
'use strict';
const get = require('../get');
const getSchemaDiscriminatorByValue = require('../discriminator/getSchemaDiscriminatorByValue');
/**
* Like `schema.path()`, except with a document, because impossible to
* determine path type without knowing the embedded discriminator key.
*
* @param {Document} doc
* @param {String|String[]} path
* @param {Object} [options]
* @api private
*/
module.exports = function getEmbeddedDiscriminatorPath(doc, path, options) {
options = options || {};
const typeOnly = options.typeOnly;
const parts = Array.isArray(path) ?
path :
(path.indexOf('.') === -1 ? [path] : path.split('.'));
let schemaType = null;
let type = 'adhocOrUndefined';
const schema = getSchemaDiscriminatorByValue(doc.schema, doc.get(doc.schema.options.discriminatorKey)) || doc.schema;
for (let i = 0; i < parts.length; ++i) {
const subpath = parts.slice(0, i + 1).join('.');
schemaType = schema.path(subpath);
if (schemaType == null) {
type = 'adhocOrUndefined';
continue;
}
if (schemaType.instance === 'Mixed') {
return typeOnly ? 'real' : schemaType;
}
type = schema.pathType(subpath);
if ((schemaType.$isSingleNested || schemaType.$isMongooseDocumentArrayElement) &&
schemaType.schema.discriminators != null) {
const discriminators = schemaType.schema.discriminators;
const discriminatorKey = doc.get(subpath + '.' +
get(schemaType, 'schema.options.discriminatorKey'));
if (discriminatorKey == null || discriminators[discriminatorKey] == null) {
continue;
}
const rest = parts.slice(i + 1).join('.');
return getEmbeddedDiscriminatorPath(doc.get(subpath), rest, options);
}
}
// Are we getting the whole schema or just the type, 'real', 'nested', etc.
return typeOnly ? type : schemaType;
};

@@ -0,0 +1,35 @@
'use strict';
const utils = require('../../utils');
const keysToSkip = new Set(['__index', '__parentArray', '_doc']);
/**
* Using spread operator on a Mongoose document gives you a
* POJO that has a tendency to cause infinite recursion. So
* we use this function on `set()` to prevent that.
*/
module.exports = function handleSpreadDoc(v, includeExtraKeys) {
if (utils.isPOJO(v) && v.$__ != null && v._doc != null) {
if (includeExtraKeys) {
const extraKeys = {};
for (const key of Object.keys(v)) {
if (typeof key === 'symbol') {
continue;
}
if (key[0] === '$') {
continue;
}
if (keysToSkip.has(key)) {
continue;
}
extraKeys[key] = v[key];
}
return { ...v._doc, ...extraKeys };
}
return v._doc;
}
return v;
};
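To see why this guard exists: spreading a hydrated document yields a POJO that still carries the internal `$__` state and `_doc` payload, and setting that object back onto a document can recurse. A standalone sketch (the inlined `isPOJO` check only approximates `utils.isPOJO`, and the `includeExtraKeys` branch is omitted):

```javascript
// Approximation of utils.isPOJO: plain objects only, not class instances.
function isPOJO(arg) {
  if (arg == null || typeof arg !== 'object') {
    return false;
  }
  const proto = Object.getPrototypeOf(arg);
  return !proto || proto.constructor.name === 'Object';
}

// Simplified handleSpreadDoc, without the includeExtraKeys branch.
function handleSpreadDoc(v) {
  if (isPOJO(v) && v.$__ != null && v._doc != null) {
    return v._doc; // hand back just the raw data
  }
  return v;
}

// Stand-in for what `{ ...someMongooseDoc }` produces.
const spreadDoc = { $__: { activePaths: {} }, _doc: { name: 'Ada' } };
console.log(handleSpreadDoc(spreadDoc));       // { name: 'Ada' }
console.log(handleSpreadDoc({ name: 'Ada' })); // unchanged: no $__/_doc markers
```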

@@ -0,0 +1,25 @@
'use strict';
module.exports = function each(arr, cb, done) {
if (arr.length === 0) {
return done();
}
let remaining = arr.length;
let err = null;
for (const v of arr) {
cb(v, function(_err) {
if (err != null) {
return;
}
if (_err != null) {
err = _err;
return done(err);
}
if (--remaining <= 0) {
return done();
}
});
}
};
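`each` kicks off `cb` for every element up front and reports exactly once: either the first error, or completion after all callbacks finish. With synchronous callbacks the behavior is easy to see (the helper is copied inline to run standalone):

```javascript
// Inline copy of the helper: fires cb for all items, calls done() once.
function each(arr, cb, done) {
  if (arr.length === 0) {
    return done();
  }
  let remaining = arr.length;
  let err = null;
  for (const v of arr) {
    cb(v, function(_err) {
      if (err != null) {
        return; // an earlier error was already reported; ignore the rest
      }
      if (_err != null) {
        err = _err;
        return done(err);
      }
      if (--remaining <= 0) {
        return done();
      }
    });
  }
}

const doubled = [];
each([1, 2, 3], (v, next) => { doubled.push(v * 2); next(); }, () => {
  console.log(doubled); // [ 2, 4, 6 ]
});

let caught = null;
each([1, 2, 3], (v, next) => next(v === 2 ? new Error('boom') : null),
  err => { caught = err; });
console.log(caught && caught.message); // 'boom' -- done() fires on the first error
```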

@@ -0,0 +1,22 @@
'use strict';
/*!
* ignore
*/
module.exports = function combinePathErrors(err) {
const keys = Object.keys(err.errors || {});
const len = keys.length;
const msgs = [];
let key;
for (let i = 0; i < len; ++i) {
key = keys[i];
if (err === err.errors[key]) {
continue;
}
msgs.push(key + ': ' + err.errors[key].message);
}
return msgs.join(', ');
};
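Given a ValidationError-shaped object whose `errors` map holds per-path errors, the helper flattens them into one message. A runnable sketch with a plain-object stand-in for the error (the helper is a condensed copy):

```javascript
// Condensed copy of combinePathErrors.
function combinePathErrors(err) {
  const msgs = [];
  for (const key of Object.keys(err.errors || {})) {
    if (err === err.errors[key]) {
      continue; // guard against self-references
    }
    msgs.push(key + ': ' + err.errors[key].message);
  }
  return msgs.join(', ');
}

// Stand-in for a ValidationError with two failed paths.
const fakeValidationError = {
  errors: {
    name: { message: 'Path `name` is required.' },
    age: { message: 'Path `age` must be positive.' }
  }
};
console.log(combinePathErrors(fakeValidationError));
// 'name: Path `name` is required., age: Path `age` must be positive.'
```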

@@ -0,0 +1,8 @@
'use strict';
module.exports = function firstKey(obj) {
if (obj == null) {
return null;
}
return Object.keys(obj)[0];
};

@@ -0,0 +1,65 @@
'use strict';
/**
* Simplified lodash.get to work around the annoying null quirk. See:
* https://github.com/lodash/lodash/issues/3659
* @api private
*/
module.exports = function get(obj, path, def) {
let parts;
let isPathArray = false;
if (typeof path === 'string') {
if (path.indexOf('.') === -1) {
const _v = getProperty(obj, path);
if (_v == null) {
return def;
}
return _v;
}
parts = path.split('.');
} else {
isPathArray = true;
parts = path;
if (parts.length === 1) {
const _v = getProperty(obj, parts[0]);
if (_v == null) {
return def;
}
return _v;
}
}
let rest = path;
let cur = obj;
for (const part of parts) {
if (cur == null) {
return def;
}
// `lib/cast.js` depends on being able to get dotted paths in updates,
// like `{ $set: { 'a.b': 42 } }`
if (!isPathArray && cur[rest] != null) {
return cur[rest];
}
cur = getProperty(cur, part);
if (!isPathArray) {
rest = rest.substr(part.length + 1);
}
}
return cur == null ? def : cur;
};
function getProperty(obj, prop) {
if (obj == null) {
return obj;
}
if (obj instanceof Map) {
return obj.get(prop);
}
return obj[prop];
}
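Two behaviors distinguish this from a naive dotted-path lookup: a literal dotted key like `'a.b'` wins over nested traversal (update casting depends on this), and `Map` values are read with `.get()`. Demonstrated against an inline copy of the function so the examples run standalone:

```javascript
// Inline copy of the get() helper and its getProperty() companion.
function get(obj, path, def) {
  let parts;
  let isPathArray = false;
  if (typeof path === 'string') {
    if (path.indexOf('.') === -1) {
      const _v = getProperty(obj, path);
      return _v == null ? def : _v;
    }
    parts = path.split('.');
  } else {
    isPathArray = true;
    parts = path;
    if (parts.length === 1) {
      const _v = getProperty(obj, parts[0]);
      return _v == null ? def : _v;
    }
  }
  let rest = path;
  let cur = obj;
  for (const part of parts) {
    if (cur == null) {
      return def;
    }
    // Literal dotted keys take priority, e.g. { $set: { 'a.b': 42 } }.
    if (!isPathArray && cur[rest] != null) {
      return cur[rest];
    }
    cur = getProperty(cur, part);
    if (!isPathArray) {
      rest = rest.substr(part.length + 1);
    }
  }
  return cur == null ? def : cur;
}

function getProperty(obj, prop) {
  if (obj == null) {
    return obj;
  }
  if (obj instanceof Map) {
    return obj.get(prop);
  }
  return obj[prop];
}

console.log(get({ a: { b: 1 } }, 'a.b'));     // 1      -- nested traversal
console.log(get({ 'a.b': 42 }, 'a.b'));       // 42     -- literal dotted key wins
console.log(get({ a: null }, 'a.b', 'dflt')); // 'dflt' -- null short-circuits to default
console.log(get(new Map([['k', 7]]), 'k'));   // 7      -- Map lookup via .get()
```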

@@ -0,0 +1,16 @@
'use strict';
/**
* If `val` is an object, returns constructor name, if possible. Otherwise returns undefined.
* @api private
*/
module.exports = function getConstructorName(val) {
if (val == null) {
return void 0;
}
if (typeof val.constructor !== 'function') {
return void 0;
}
return val.constructor.name;
};

@@ -0,0 +1,18 @@
'use strict';
function getDefaultBulkwriteResult() {
return {
ok: 1,
nInserted: 0,
nUpserted: 0,
nMatched: 0,
nModified: 0,
nRemoved: 0,
upserted: [],
writeErrors: [],
insertedIds: [],
writeConcernErrors: []
};
}
module.exports = getDefaultBulkwriteResult;

View file

@ -0,0 +1,10 @@
'use strict';

const functionNameRE = /^function\s*([^\s(]+)/;

module.exports = function(fn) {
  return (
    fn.name ||
    (fn.toString().trim().match(functionNameRE) || [])[1]
  );
};
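The regex fallback only matters when `fn.name` is empty; a standalone sketch (helper copied inline) showing the name being recovered from the function's source text:

```javascript
const functionNameRE = /^function\s*([^\s(]+)/;

// Inline copy of the helper above: prefer fn.name, otherwise parse the name
// out of the function's source text.
function getFunctionName(fn) {
  return (
    fn.name ||
    (fn.toString().trim().match(functionNameRE) || [])[1]
  );
}

function ObjectId() {}
console.log(getFunctionName(ObjectId)); // 'ObjectId'

// The toString() fallback kicks in when `fn.name` has been cleared:
function realName() {}
Object.defineProperty(realName, 'name', { value: '' });
console.log(getFunctionName(realName)); // 'realName' — recovered from the source
```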

View file

@ -0,0 +1,16 @@
/*!
 * Centralize this so we can more easily work around issues with people
 * stubbing out `process.nextTick()` in tests using sinon:
 * https://github.com/sinonjs/lolex#automatically-incrementing-mocked-time
 * See gh-6074
 */

'use strict';

const nextTick = typeof process !== 'undefined' && typeof process.nextTick === 'function' ?
  process.nextTick.bind(process) :
  cb => setTimeout(cb, 0); // Fallback for browser build

module.exports = function immediate(cb) {
  return nextTick(cb);
};
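The point of the wrapper is that the callback is always deferred to a later tick rather than run synchronously; a minimal standalone re-creation demonstrating that:

```javascript
// Minimal re-creation of the immediate helper above: defer to process.nextTick
// when available, otherwise fall back to setTimeout (browser build).
const nextTick = typeof process !== 'undefined' && typeof process.nextTick === 'function' ?
  process.nextTick.bind(process) :
  cb => setTimeout(cb, 0);

function immediate(cb) {
  return nextTick(cb);
}

let called = false;
immediate(() => { called = true; });
console.log(called); // false — the callback has not run on the current tick
```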

View file

@ -0,0 +1,13 @@
'use strict';

const isTextIndex = require('./isTextIndex');

module.exports = function applySchemaCollation(indexKeys, indexOptions, schemaOptions) {
  if (isTextIndex(indexKeys)) {
    return;
  }

  if (schemaOptions.hasOwnProperty('collation') && !indexOptions.hasOwnProperty('collation')) {
    indexOptions.collation = schemaOptions.collation;
  }
};
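A standalone sketch of the collation inheritance. `isTextIndex` lives in another file, so the snippet substitutes a plausible stand-in (an index is treated as a text index when any key maps to the string `'text'`) — treat that stand-in as an assumption, not the module's actual implementation. The text-index exclusion reflects MongoDB's rule that text indexes only support simple binary comparison, not collations.

```javascript
// Hypothetical stand-in for the required ./isTextIndex helper: a text index is
// assumed to be declared by mapping at least one key to the string 'text'.
function isTextIndex(indexKeys) {
  return Object.keys(indexKeys).some(key => indexKeys[key] === 'text');
}

// Inline copy of applySchemaCollation from above.
function applySchemaCollation(indexKeys, indexOptions, schemaOptions) {
  if (isTextIndex(indexKeys)) {
    return;
  }
  if (schemaOptions.hasOwnProperty('collation') && !indexOptions.hasOwnProperty('collation')) {
    indexOptions.collation = schemaOptions.collation;
  }
}

const opts = {};
applySchemaCollation({ name: 1 }, opts, { collation: { locale: 'en' } });
console.log(opts.collation); // { locale: 'en' } — inherited from the schema

const textOpts = {};
applySchemaCollation({ name: 'text' }, textOpts, { collation: { locale: 'en' } });
console.log(textOpts.collation); // undefined — text indexes don't take a collation
```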

View file

@ -0,0 +1,14 @@
'use strict';

module.exports = function decorateDiscriminatorIndexOptions(schema, indexOptions) {
  // If the model is a discriminator and has an index, add a
  // partialFilterExpression by default so the index will only apply
  // to that discriminator. Skipped when `sparse` is set, because MongoDB
  // does not allow combining `sparse` and `partialFilterExpression` on
  // the same index.
  const discriminatorName = schema.discriminatorMapping && schema.discriminatorMapping.value;
  if (discriminatorName && !('sparse' in indexOptions)) {
    const discriminatorKey = schema.options.discriminatorKey;
    indexOptions.partialFilterExpression = indexOptions.partialFilterExpression || {};
    indexOptions.partialFilterExpression[discriminatorKey] = discriminatorName;
  }
  return indexOptions;
};
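A standalone sketch of the decoration, driven by a minimal hand-rolled object standing in for a real Mongoose discriminator schema (the real argument is a full Schema instance; `__t` is Mongoose's default discriminator key):

```javascript
// Inline copy of decorateDiscriminatorIndexOptions from above.
function decorateDiscriminatorIndexOptions(schema, indexOptions) {
  const discriminatorName = schema.discriminatorMapping && schema.discriminatorMapping.value;
  if (discriminatorName && !('sparse' in indexOptions)) {
    const discriminatorKey = schema.options.discriminatorKey;
    indexOptions.partialFilterExpression = indexOptions.partialFilterExpression || {};
    indexOptions.partialFilterExpression[discriminatorKey] = discriminatorName;
  }
  return indexOptions;
}

// Hand-rolled stand-in for a child discriminator's schema.
const fakeChildSchema = {
  discriminatorMapping: { value: 'Clicked' },
  options: { discriminatorKey: '__t' }
};

console.log(decorateDiscriminatorIndexOptions(fakeChildSchema, { unique: true }));
// { unique: true, partialFilterExpression: { __t: 'Clicked' } }

console.log(decorateDiscriminatorIndexOptions(fakeChildSchema, { sparse: true }));
// { sparse: true } — sparse indexes are left untouched
```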

View file

@ -0,0 +1,63 @@
'use strict';

const hasDollarKeys = require('../query/hasDollarKeys');

function getRelatedSchemaIndexes(model, schemaIndexes) {
  return getRelatedIndexes({
    baseModelName: model.baseModelName,
    discriminatorMapping: model.schema.discriminatorMapping,
    indexes: schemaIndexes,
    indexesType: 'schema'
  });
}

function getRelatedDBIndexes(model, dbIndexes) {
  return getRelatedIndexes({
    baseModelName: model.baseModelName,
    discriminatorMapping: model.schema.discriminatorMapping,
    indexes: dbIndexes,
    indexesType: 'db'
  });
}

module.exports = {
  getRelatedSchemaIndexes,
  getRelatedDBIndexes
};

function getRelatedIndexes({
  baseModelName,
  discriminatorMapping,
  indexes,
  indexesType
}) {
  const discriminatorKey = discriminatorMapping && discriminatorMapping.key;
  const discriminatorValue = discriminatorMapping && discriminatorMapping.value;

  if (!discriminatorKey) {
    return indexes;
  }

  const isChildDiscriminatorModel = Boolean(baseModelName);
  if (isChildDiscriminatorModel) {
    return indexes.filter(index => {
      const partialFilterExpression = getPartialFilterExpression(index, indexesType);
      return partialFilterExpression && partialFilterExpression[discriminatorKey] === discriminatorValue;
    });
  }

  return indexes.filter(index => {
    const partialFilterExpression = getPartialFilterExpression(index, indexesType);
    return !partialFilterExpression
      || !partialFilterExpression[discriminatorKey]
      || (hasDollarKeys(partialFilterExpression[discriminatorKey]) && !('$eq' in partialFilterExpression[discriminatorKey]));
  });
}

function getPartialFilterExpression(index, indexesType) {
  if (indexesType === 'schema') {
    const options = index[1];
    return options && options.partialFilterExpression;
  }
  return index.partialFilterExpression;
}
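A standalone sketch of the filtering: a child discriminator model only "owns" indexes whose `partialFilterExpression` pins its own discriminator value, while the base model keeps unfiltered indexes. Since `hasDollarKeys` lives in another file, the snippet substitutes a plausible stand-in (true when the value is an object with a `$`-prefixed key); the `discriminatorMapping` objects are also hand-rolled stand-ins for what the schema would provide — both are assumptions for illustration.

```javascript
// Hypothetical stand-in for the required ../query/hasDollarKeys helper.
function hasDollarKeys(obj) {
  return obj != null && typeof obj === 'object' &&
    Object.keys(obj).some(key => key.startsWith('$'));
}

// Inline copy (lightly abbreviated) of getRelatedIndexes from above.
function getRelatedIndexes({ baseModelName, discriminatorMapping, indexes, indexesType }) {
  const discriminatorKey = discriminatorMapping && discriminatorMapping.key;
  const discriminatorValue = discriminatorMapping && discriminatorMapping.value;
  if (!discriminatorKey) {
    return indexes;
  }
  const isChildDiscriminatorModel = Boolean(baseModelName);
  if (isChildDiscriminatorModel) {
    return indexes.filter(index => {
      const pfe = getPartialFilterExpression(index, indexesType);
      return pfe && pfe[discriminatorKey] === discriminatorValue;
    });
  }
  return indexes.filter(index => {
    const pfe = getPartialFilterExpression(index, indexesType);
    return !pfe
      || !pfe[discriminatorKey]
      || (hasDollarKeys(pfe[discriminatorKey]) && !('$eq' in pfe[discriminatorKey]));
  });
}

function getPartialFilterExpression(index, indexesType) {
  if (indexesType === 'schema') {
    const options = index[1];
    return options && options.partialFilterExpression;
  }
  return index.partialFilterExpression;
}

// Schema-style indexes: [keys, options] pairs.
const schemaIndexes = [
  [{ time: 1 }, { partialFilterExpression: { kind: 'Click' } }],
  [{ url: 1 }, {}]
];

const childRelated = getRelatedIndexes({
  baseModelName: 'Event',
  discriminatorMapping: { key: 'kind', value: 'Click' },
  indexes: schemaIndexes,
  indexesType: 'schema'
});
console.log(childRelated.length); // 1 — only the { time: 1 } index

const baseRelated = getRelatedIndexes({
  baseModelName: undefined,
  discriminatorMapping: { key: 'kind', value: null },
  indexes: schemaIndexes,
  indexesType: 'schema'
});
console.log(baseRelated.length); // 1 — only the { url: 1 } index
```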

View file

@ -0,0 +1,18 @@
'use strict';

const get = require('../get');

module.exports = function isDefaultIdIndex(index) {
  if (Array.isArray(index)) {
    // Mongoose syntax
    const keys = Object.keys(index[0]);
    return keys.length === 1 && keys[0] === '_id' && index[0]._id !== 'hashed';
  }

  if (typeof index !== 'object') {
    return false;
  }

  const key = get(index, 'key', {});
  return Object.keys(key).length === 1 && key.hasOwnProperty('_id');
};
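A standalone sketch covering both accepted shapes: Mongoose's `[keys, options]` array syntax and the driver's `listIndexes()` object syntax. The `../get` dependency only ever sees the single-segment path `'key'` here, so the snippet substitutes a trivial property-access stand-in for it:

```javascript
// Trivial stand-in for the required ../get helper: plain property access with
// a default, sufficient for the single-segment path used below.
function get(obj, path, def) {
  const val = obj == null ? undefined : obj[path];
  return val == null ? def : val;
}

// Inline copy of isDefaultIdIndex from above.
function isDefaultIdIndex(index) {
  if (Array.isArray(index)) {
    // Mongoose syntax
    const keys = Object.keys(index[0]);
    return keys.length === 1 && keys[0] === '_id' && index[0]._id !== 'hashed';
  }
  if (typeof index !== 'object') {
    return false;
  }
  const key = get(index, 'key', {});
  return Object.keys(key).length === 1 && key.hasOwnProperty('_id');
}

console.log(isDefaultIdIndex([{ _id: 1 }, {}]));                  // true — Mongoose syntax
console.log(isDefaultIdIndex([{ _id: 'hashed' }, {}]));           // false — hashed _id is user-defined
console.log(isDefaultIdIndex({ key: { _id: 1 }, name: '_id_' })); // true — listIndexes() syntax
console.log(isDefaultIdIndex({ key: { _id: 1, name: 1 } }));      // false — compound index
```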

View file

@ -0,0 +1,96 @@
'use strict';

const get = require('../get');
const utils = require('../../utils');

/**
 * Given a Mongoose index definition (key + options objects) and a MongoDB server
 * index definition, determine if the two indexes are equal.
 *
 * @param {Object} schemaIndexKeysObject the Mongoose index spec
 * @param {Object} options the Mongoose index definition's options
 * @param {Object} dbIndex the index in MongoDB as returned by `listIndexes()`
 * @api private
 */

module.exports = function isIndexEqual(schemaIndexKeysObject, options, dbIndex) {
  // Special case: text indexes have a special format in the db. For example,
  // `{ name: 'text' }` becomes:
  // {
  //   v: 2,
  //   key: { _fts: 'text', _ftsx: 1 },
  //   name: 'name_text',
  //   ns: 'test.tests',
  //   background: true,
  //   weights: { name: 1 },
  //   default_language: 'english',
  //   language_override: 'language',
  //   textIndexVersion: 3
  // }
  if (dbIndex.textIndexVersion != null) {
    delete dbIndex.key._fts;
    delete dbIndex.key._ftsx;
    const weights = { ...dbIndex.weights, ...dbIndex.key };
    if (Object.keys(weights).length !== Object.keys(schemaIndexKeysObject).length) {
      return false;
    }
    for (const prop of Object.keys(weights)) {
      if (!(prop in schemaIndexKeysObject)) {
        return false;
      }
      const weight = weights[prop];
      if (weight !== get(options, 'weights.' + prop) && !(weight === 1 && get(options, 'weights.' + prop) == null)) {
        return false;
      }
    }
    if (options['default_language'] !== dbIndex['default_language']) {
      return dbIndex['default_language'] === 'english' && options['default_language'] == null;
    }
    return true;
  }

  const optionKeys = [
    'unique',
    'partialFilterExpression',
    'sparse',
    'expireAfterSeconds',
    'collation'
  ];
  for (const key of optionKeys) {
    if (!(key in options) && !(key in dbIndex)) {
      continue;
    }
    if (key === 'collation') {
      if (options[key] == null || dbIndex[key] == null) {
        return options[key] == null && dbIndex[key] == null;
      }
      const definedKeys = Object.keys(options.collation);
      const schemaCollation = options.collation;
      const dbCollation = dbIndex.collation;
      for (const opt of definedKeys) {
        if (get(schemaCollation, opt) !== get(dbCollation, opt)) {
          return false;
        }
      }
    } else if (!utils.deepEqual(options[key], dbIndex[key])) {
      return false;
    }
  }

  const schemaIndexKeys = Object.keys(schemaIndexKeysObject);
  const dbIndexKeys = Object.keys(dbIndex.key);
  if (schemaIndexKeys.length !== dbIndexKeys.length) {
    return false;
  }
  for (let i = 0; i < schemaIndexKeys.length; ++i) {
    if (schemaIndexKeys[i] !== dbIndexKeys[i]) {
      return false;
    }
    if (!utils.deepEqual(schemaIndexKeysObject[schemaIndexKeys[i]], dbIndex.key[dbIndexKeys[i]])) {
      return false;
    }
  }

  return true;
};

Some files were not shown because too many files have changed in this diff.