Introduction

MongoDB bridges the gap between key-value stores (which are fast and highly scalable) and traditional RDBMS systems (which provide rich queries and deep functionality).

MongoDB (from "humongous") is a scalable, high-performance, open source, document-oriented database.

This project aims to provide an object-mapping layer on top of Mongo to ease common activities such as:

  • Marshalling from Mongo to Groovy/Java types and back again

  • Support for GORM dynamic finders, criteria and named queries

  • Session-managed transactions

  • Validating domain instances backed by the Mongo datastore

Compatibility with GORM for Hibernate

This implementation tries to be as compatible as possible with GORM for Hibernate. In general you can refer to the GORM documentation and the "Domain Classes" section of the reference guide for usage information.

The following key features are supported by GORM for Mongo:

  • Simple persistence methods

  • Dynamic finders

  • Criteria queries

  • Named queries

  • Inheritance

  • Embedded types

  • Query by example

However, some features are not supported:

  • HQL queries

  • Composite primary keys

  • Many-to-many associations (these can be modelled with a mapping class)

  • Any direct interaction with the Hibernate API

  • Custom Hibernate user types (custom types are allowed with a different API)

There may be other limitations not mentioned here so in general it shouldn’t be expected that an application based on GORM for Hibernate will "just work" without some tweaking involved. Having said that, the large majority of common GORM functionality is supported.

Release Notes

Below are the details of the changes across releases:

7.1

  • Support for Apache Groovy 3 and Java 14

  • Upgrade to mongodb-driver-sync 4.3.3

  • Autowire bean by type in the Data Service

  • Compatible only with Grails 5

7.0

  • Support for MongoDB Driver 3.10.0

  • Support for Java 11

  • Removal of RxJava 1.x Module

  • Java 8 Minimum

6.1

  • GORM Data Services Support

  • Package Scanning Constructors

  • Decimal128 Type Support

  • MongoDB 3.4.x Java Driver Support

6.0

  • Multiple Data Sources Support

  • Multi Tenancy Support

  • RxGORM for MongoDB (Using MongoDB Rx Drivers)

  • Unified Configuration model

5.0

  • MongoDB 3.x driver support

  • New Codec Persistence Engine

  • Removal of GMongo

  • Trait based

4.0

  • Grails 3 compatibility

3.0

  • Support for MongoDB 2.6

  • MongoDB 2.6 GeoJSON type support (MultiPoint, MultiLineString, MultiPolygon and GeometryCollection)

  • Support for Maps of embedded entities

  • Flexible index definition

  • Full text search support

  • Support for projections using MongoDB aggregation

  • Size related criteria implemented (sizeEq, sizeLt etc.) on collections

2.0

  • GeoJSON shape support

  • Support for SSL connections

  • Support for MongoDB connection strings

1.3

  • Support for stateless mode to improve read performance

  • Support for dynamically switching which database or collection to persist to at runtime

1.2

MongoDB plugin 1.2 and above requires Grails 2.1.5 or 2.2.1 as a minimum Grails version. If you are using an older version of Grails you will need to stay with 1.1.

1.1 GA

  • DBRefs no longer used by default for associations

  • Upgrade to GMongo 1.0 and Spring Data MongoDB 1.1

  • Support for global mapping configuration

1.0 GA

  • Initial feature complete 1.0 release

Upgrade Notes

Dependency Upgrades

GORM 7.1 supports Apache Groovy 3, Java 14, MongoDB Driver 4.3 and Spring 5.3.x.

Each of these underlying components may have changes that require altering your application. These changes are beyond the scope of this documentation.

Default Autowire By Type inside GORM Data Services

A Grails service (or any bean) inside a GORM Data Service will now default to autowiring by type. For example:

./grails-app/services/example/BookService.groovy

package example

import grails.gorm.services.Service

@Service(Book)
abstract class BookService {

    TestService testRepo

    abstract Book save(String title, String author)

    void doSomething() {
        assert testRepo != null
    }
}

Please note that with autowire by-type as the default, the application will throw an exception when multiple beans of the same type are found. Use the Spring @Qualifier annotation to fine-tune annotation-based autowiring with qualifiers.
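For example, a minimal sketch of disambiguating an injected bean with a qualifier (the bean name testService below is illustrative, not part of the original example):

package example

import grails.gorm.services.Service
import org.springframework.beans.factory.annotation.Qualifier

@Service(Book)
abstract class BookService {

    // assumes a Spring bean named 'testService' exists; the name is illustrative only
    @Qualifier("testService")
    TestService testRepo

    abstract Book save(String title, String author)
}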

Getting Started

Basic Setup

To get started with GORM for MongoDB within Grails you need to configure it as a dependency in build.gradle:

dependencies {
    compile 'org.grails.plugins:mongodb:8.2.0'
}

Common Errors

If you receive an error that indicates a failure to resolve the grails-datastore-simple dependency you may need to add the following to build.gradle directly above the dependencies block:

build.gradle
configurations.all {
    exclude module:'grails-datastore-simple'
}

If you receive an error at runtime such as:

Caused by: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class org.bson.BsonDecimal128.
        at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46)
        at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63)
        at org.bson.codecs.configuration.ChildCodecRegistry.get(ChildCodecRegistry.java:51)
        at org.bson.codecs.BsonTypeCodecMap.<init>(BsonTypeCodecMap.java:44)
        at org.bson.codecs.BsonDocumentCodec.<init>(BsonDocumentCodec.java:65)

It means you have an older version of the MongoDB Java driver on your classpath and you should add the following declaration to build.gradle to ensure the dependency is correct:

build.gradle
compile "org.mongodb:mongodb-driver:4.11.1"

Configuring MongoDB

With that done you need to set up a running MongoDB server. Refer to the MongoDB Documentation for an explanation on how to startup a MongoDB instance. Once installed, starting MongoDB is typically a matter of executing the following command:

MONGO_HOME/bin/mongod

With the above command executed in a terminal window you should see output like the following appear:

2015-11-18T19:38:50.073+0100 I JOURNAL  [initandlisten] journal dir=/data/db/journal
2015-11-18T19:38:50.073+0100 I JOURNAL  [initandlisten] recover : no journal files present, no recovery needed
2015-11-18T19:38:50.090+0100 I JOURNAL  [durability] Durability thread started
2015-11-18T19:38:50.090+0100 I JOURNAL  [journal writer] Journal writer thread started
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] MongoDB starting : pid=52540 port=27017 dbpath=/data/db 64-bit host=Graemes-iMac.local
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] ** WARNING: You are running this process as the root user, which is not recommended.
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten]
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten]
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] ** WARNING: soft rlimits too low. Number of files is 256, should be at least 1000
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] db version v3.0.4
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] git version: 0481c958daeb2969800511e7475dc66986fa9ed5
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] build info: Darwin mci-osx108-11.build.10gen.cc 12.5.0 Darwin Kernel Version 12.5.0: Sun Sep 29 13:33:47 PDT 2013; root:xnu-2050.48.12~1/RELEASE_X86_64 x86_64 BOOST_LIB_VERSION=1_49
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] allocator: system
2015-11-18T19:38:50.090+0100 I CONTROL  [initandlisten] options: {}
2015-11-18T19:38:50.176+0100 I NETWORK  [initandlisten] waiting for connections on port 27017

As you can see, the server is running on port 27017. Don't worry, the MongoDB plugin for Grails will automatically configure itself to look for MongoDB on that port by default.

If you want to configure how Grails connects to MongoDB then you can do so using the following settings in grails-app/conf/application.yml:

grails:
    mongodb:
        host: "localhost"
        port: 27017
        username: "blah"
        password: "blah"
        databaseName: "foo"

Using MongoDB Standalone

If you plan to use MongoDB as your primary datastore then you need to remove the Hibernate plugin from the build.gradle file by commenting out the hibernate line in the dependencies block:

compile 'org.grails.plugins:hibernate'

With this done all domain classes in grails-app/domain will be persisted via MongoDB and not Hibernate. You can create a domain class by running the regular create-domain-class command:

grails create-domain-class Person

The Person domain class will automatically be a persistent entity that can be stored in MongoDB.

Combining MongoDB and Hibernate

If you have both the Hibernate and Mongo plugins installed then by default all classes in the grails-app/domain directory will be persisted by Hibernate and not Mongo. If you want to persist a particular domain class with Mongo then you must use the mapWith property in the domain class:

static mapWith = "mongo"
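For example, a domain class persisted with MongoDB while the rest of the application continues to use Hibernate might look like this:

class Person {
    String firstName
    String lastName

    // persist this class with the MongoDB plugin instead of Hibernate
    static mapWith = "mongo"
}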

Advanced Configuration

Mongo Database Connection Configuration

As mentioned the GORM for MongoDB plugin will configure all the defaults for you, but if you wish to customize those defaults you can do so in the grails-app/conf/application.groovy file:

grails {
    mongodb {
        host = "localhost"
        port = 27017
        username = "blah"
        password = "blah"
        databaseName = "foo"
    }
}

The databaseName setting configures the default database name. If not specified the databaseName will default to the name of your application.

You can also customize the MongoDB connection settings using an options block:

grails {
    mongodb {
        options {
            autoConnectRetry = true
            connectTimeout = 300
        }
    }
}

Available options and their descriptions are defined in the MongoClientOptions javadoc.

MongoDB Connection Strings

Since 2.0, you can also use MongoDB connection strings to configure the connection:

grails {
    mongodb {
        url = "mongodb://localhost/mydb"
    }
}

Using MongoDB connection strings is currently the most flexible and recommended way to configure MongoDB connections.
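As a sketch, a connection string can also carry credentials, multiple hosts and driver options in one place (the hosts, credentials and option values below are purely illustrative):

grails {
    mongodb {
        // illustrative values only
        url = "mongodb://myuser:mypassword@host1:27017,host2:27017/mydb?replicaSet=rs0&maxPoolSize=50"
    }
}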

Configuration Options Guide

Below is a complete example showing all configuration options:

grails {
    mongodb {
        databaseName = "myDb" // the default database name
        host = "localhost" // the host to connect to
        port = 27017 // the port to connect to
        username = ".." // the username to connect with
        password = ".." // the password to connect with
        stateless = false // whether to use stateless sessions by default

        // Alternatively, using  'url'
        // url = "mongodb://localhost/mydb"

        options {
            connectionsPerHost = 10 // The maximum number of connections allowed per host
            threadsAllowedToBlockForConnectionMultiplier = 5
            maxWaitTime = 120000 // Max wait time of a blocking thread for a connection.
            connectTimeout = 0 // The connect timeout in milliseconds. 0 == infinite
            socketTimeout = 0 // The socket timeout. 0 == infinite
            socketKeepAlive = false // Whether or not to have socket keep alive turned on
            writeConcern = new com.mongodb.WriteConcern(0, 0, false) // Specifies the number of servers to wait for on the write operation, and exception raising behavior
            sslEnabled = false // Specifies if the driver should use an SSL connection to Mongo
            socketFactory = ... // Specifies the SocketFactory to use for creating connections
        }
    }
}

Global Mapping Configuration

Using the grails.mongodb.default.mapping setting in grails-app/conf/application.groovy you can configure global mapping options across your domain classes. This is useful if, for example, you want to disable optimistic locking globally or you wish to use DBRefs in your association mappings. For example, the following configuration will disable optimistic locking globally and use DBRefs for all properties:

grails.mongodb.default.mapping = {
    version false
    '*'(reference:true)
}

The * method is used to indicate that the setting applies to all properties.

Using GORM in Spring Boot

To use GORM for MongoDB in Spring Boot add the necessary dependencies to your Boot application:

compile("org.grails:gorm-mongodb-spring-boot:8.2.0")

Ensure your Boot Application class is annotated with ComponentScan, for example:

import org.springframework.boot.SpringApplication
import org.springframework.boot.autoconfigure.EnableAutoConfiguration
import org.springframework.context.annotation.*

@Configuration
@EnableAutoConfiguration
@ComponentScan
class Application {
    static void main(String[] args) {
        SpringApplication.run Application, args
    }
}
Using ComponentScan without a value results in Boot scanning for classes in the same package or any package nested within the Application class package. If your GORM entities are in a different package specify the package name as the value of the ComponentScan annotation.

Finally create your GORM entities and ensure they are annotated with grails.persistence.Entity:

import grails.persistence.*

@Entity
class Person {
    String firstName
    String lastName
}

GORM for MongoDB without Grails

If you wish to use GORM for MongoDB outside of a Grails application you should declare the necessary dependencies, for example in Gradle:

compile "org.grails:grails-datastore-gorm-mongodb:8.2.0"

Then annotate your entities with the grails.gorm.annotation.Entity annotation:

@Entity
class Person {
    String name
}

Then you need to place the bootstrap logic somewhere in the loading sequence of your application which uses MongoDatastore:

def datastore = new MongoDatastore(Person)

println Person.count()

For configuration you can either pass a map or an instance of the org.springframework.core.env.PropertyResolver interface:

def initializer = new MongoDatastore(['grails.mongodb.url':'mongodb://myserver'], Person)

println Person.count()

If you are using Spring with an existing ApplicationContext you can instead call MongoDbDataStoreSpringInitializer.configureForBeanDefinitionRegistry prior to refreshing the context. You can pass the Spring Environment object to the constructor for configuration:

ApplicationContext myApplicationContext = ...
def initializer = new MongoDbDataStoreSpringInitializer(myApplicationContext.getEnvironment(), Person)
initializer.configureForBeanDefinitionRegistry(myApplicationContext)

println Person.count()

Mapping Domain Classes

Basic Mapping

The way GORM for MongoDB works is to map each domain class to a Mongo collection. For example given a domain class such as:

class Person {
    String firstName
    String lastName
    static hasMany = [pets:Pet]
}

This will map onto a MongoDB Collection called "person".
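For example, saving an instance produces a document in that collection roughly like the following (the field values and generated id shown are illustrative):

new Person(firstName: "Fred", lastName: "Flintstone").save(flush: true)

// approximate shape of the stored document in the "person" collection:
// { "_id": NumberLong(1), "firstName": "Fred", "lastName": "Flintstone", "version": 0 }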

Embedded Documents

It is quite common in MongoDB to embed documents within documents (nested documents). This can be done with GORM embedded types:

class Person {
    String firstName
    String lastName
    Address address
    static embedded = ['address']
}

You can map embedded lists and sets of documents/domain classes:

class Person {
    String firstName
    String lastName
    Address address
    List otherAddresses
    static embedded = ['address', 'otherAddresses']
}

You can also embed maps of embedded classes where the keys are strings:

class Person {
    String firstName
    String lastName
    Map<String,Address> addresses
    static embedded = ['addresses']
}
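For example, assuming Address has a simple city property, a Person with a map of embedded addresses could be saved like this (a sketch, not a verbatim example from the reference guide):

def addresses = [
    home: new Address(city: "London"),
    work: new Address(city: "Cambridge")
]
new Person(firstName: "Fred", lastName: "Flintstone", addresses: addresses).save(flush: true)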

Basic Collection Types

You can also map lists and maps of basic types (such as strings) simply by defining the appropriate collection type:

class Person {
    List<String> friends
    Map pets
}

...

new Person(friends:['Fred', 'Bob'], pets:[chuck:"Dog", eddie:'Parrot']).save(flush:true)

Basic collection types are stored as native ArrayList and BSON documents within the Mongo documents.

Customized Collection and Database Mapping

You may wish to customize how a domain class maps onto a MongoCollection. This is possible using the mapping block as follows:

class Person {
    ..
    static mapping = {
        collection "mycollection"
        database "mydb"
    }
}

In this example we see that the Person entity has been mapped to a collection called "mycollection" in a database called "mydb".

You can also control how an individual property maps onto a Mongo Document field (the default is to use the property name itself):

class Person {
    ..
    static mapping = {
        firstName attr:"first_name"
    }
}

If you are using the mapping engine, for non-embedded associations GORM for MongoDB will by default map links between documents using MongoDB database references, also known as DBRefs.

If you prefer not to use DBRefs then you can tell GORM to use direct links by using the reference:false mapping:

class Person {
    ..
    static mapping = {
        address reference:false
    }
}

Identity Generation

By default in GORM, entities are supplied with an integer-based identifier. So for example the following entity:

class Person {}

Has a property called id of type java.lang.Long. In this case GORM for Mongo will generate a sequence based identifier using the technique described in the Mongo documentation on Atomic operations.

However, sequence based integer identifiers are not ideal for environments that require sharding (one of the nicer features of Mongo). Hence it is generally advised to use either String based ids:

class Person {
    String id
}

Or a native BSON ObjectId:

import org.bson.types.ObjectId

class Person {
    ObjectId id
}

BSON ObjectId instances are generated in a similar fashion to UUIDs.

Assigned Identifiers

Note that if you manually assign an identifier, then you will need to use the insert method instead of the save method, otherwise GORM can’t work out whether you are trying to achieve an insert or an update. Example:

class Person {
    String id
}
...
Person p = new Person()
p.id = "Fred"
// to insert
p.insert()
// to update
p.save()

Understanding Dirty Checking

In order to be as efficient as possible when it comes to generating updates, GORM for MongoDB will track changes you make to persistent instances.

When an object is updated only the properties or associations that have changed will be updated.

You can check whether a given property has changed by using the hasChanged method:

if( person.hasChanged('firstName') ) {
   // do something
}

This method is defined by the org.grails.datastore.mapping.dirty.checking.DirtyCheckable trait.

In the case of collections and association types GORM for MongoDB will wrap each collection in a dirty checking aware collection type.

One implication of this is that if you replace the collection with a type that is not dirty-checking aware, dirty checking can be disabled and the property may not be updated.

If any of your updates are not updating the properties that you anticipate you can force an update using the markDirty() method:

person.markDirty('firstName')

This will force GORM for MongoDB to issue an update for the given property name.
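A short sketch combining the two methods (the property names are illustrative):

def person = Person.findByLastName("Flintstone")
person.firstName = "Freddie"
assert person.hasChanged('firstName')

// force an update for a property whose change was not tracked
person.markDirty('lastName')
person.save(flush: true)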

Dirty Checking and Proxies

Dirty checking uses the equals() method to determine if a property has changed. In the case of associations, it is important to recognize that if the association is a proxy, comparing properties on the domain that are not related to the identifier will initialize the proxy, causing another database query.

If the association does not define an equals() method, the default Groovy behavior of checking whether the instances are the same object will be used. Because proxies are never the same instance as an instance loaded from the database, this can cause confusing behavior. It is recommended to implement the equals() method if you need to check the dirtiness of an association. For example:

class Author {
    Long id
    String name

     /**
     * This ensures that if either or both of the instances
     * have a null id (new instances), they are not equal.
     */
    @Override
    boolean equals(o) {
        if (!(o instanceof Author)) return false
        if (this.is(o)) return true
        Author that = (Author) o
        if (id !=null && that.id !=null) return id == that.id
        return false
    }
}

class Book {
    Long id
    String title
    Author author
}

Querying and Indexing

Basics

MongoDB doesn't require you to specify indices in order to query, but, as with a relational database, queries will be significantly slower without them.

With that in mind it is important to specify the properties you plan to query using the mapping block:

class Person {
    String name
    static mapping = {
        name index:true
    }
}

With the above mapping a MongoDB index will be automatically created for you. You can customize the index options using the indexAttributes configuration parameter:

class Person {
    String name
    static mapping = {
        name index:true, indexAttributes: [unique:true, dropDups:true]
    }
}

You can use MongoDB Query Hints by passing the hint argument to any dynamic finder:

def people = Person.findByName("Bob", [hint:[name:1]])

Or in a criteria query, using the "arguments" method:

Person.withCriteria {
    eq 'firstName', 'Bob'
    arguments hint: [firstName: 1]
}

Compound Indices

MongoDB supports the notion of compound keys. GORM for MongoDB enables this feature at the mapping level using the compoundIndex mapping:

class Person {
    ...
    static mapping = {
        compoundIndex name:1, age:-1
    }
}

As per the MongoDB docs 1 is for ascending and -1 is for descending.

Indexing using the index method

In addition to the convenience features described above you can use the index method to define any index you want. For example:

static mapping = {
    index( ["person.address.postCode": 1], [unique: true] )
}
}

In the above example an index is defined on an embedded attribute of the document. Whatever arguments you pass to the index method are passed through to the underlying MongoDB createIndex method.
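As a sketch, assuming the same two-argument form shown above (a keys document followed by an options map), any standard MongoDB index option can be passed through, for example a TTL index:

static mapping = {
    // illustrative: expire documents 3600 seconds after dateCreated
    index( ["dateCreated": 1], [expireAfterSeconds: 3600] )
}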

Customizing the WriteConcern

A feature of MongoDB is its ability to customize how important a database write is to the user. The Java client models this as a WriteConcern and there are various options that indicate whether the client cares about server or network errors, or whether the data has been successfully written or not.

If you wish to customize the WriteConcern for a domain class you can do so in the mapping block:

import com.mongodb.WriteConcern

class Person {
    String name
    static mapping = {
        writeConcern WriteConcern.FSYNC_SAFE
    }
}
For versioned entities, if a WriteConcern lower than WriteConcern.ACKNOWLEDGED is specified, WriteConcern.ACKNOWLEDGED will still be used for updates, to ensure that optimistic locking failures are reported.

Dynamic Attributes

Unlike a relational database, MongoDB allows for "schemaless" persistence where there are no limits to the number of attributes a particular document can have. A GORM domain class on the other hand has a schema in that there are a fixed number of properties. For example consider the following domain class:

class Plant {
    boolean goesInPatch
    String name
}

Here there are two fixed properties, name and goesInPatch, that will be persisted into the MongoDB document. Using GORM for MongoDB you can however use dynamic properties via the Groovy subscript operator. For example:

def p = new Plant(name:"Pineapple")
p['color'] = 'Yellow'
p['hasLeaves'] = true
p.save()

p = Plant.findByName("Pineapple")

println p['color']
println p['hasLeaves']

Using the subscript operator you can add additional attributes to the underlying Document instance that gets persisted to the MongoDB allowing for more dynamic domain models.

Custom User Types

GORM for MongoDB will persist all commonly known Java types like String, Integer, URL etc. However, if you want to persist one of your own classes that is not a domain class you can implement a custom user type.

Custom Codecs

GORM for MongoDB is built on top of MongoDB's BSON encoding framework. This means it is possible to implement custom Codecs for encoding and decoding values to and from BSON.

For example consider the following simple Groovy class:

class Birthday {
    Date date
}

By default the encoding engine does not know how to represent this type as a BSON value. To make the encoding engine understand this type you have to implement a custom codec:

import org.bson.*
import org.bson.codecs.*

class BirthdayCodec implements Codec<Birthday> {
    Birthday decode(BsonReader reader, DecoderContext decoderContext) {
        return new Birthday(date: new Date(reader.readDateTime())) (1)
    }
    void encode(BsonWriter writer, Birthday value, EncoderContext encoderContext) {
        writer.writeDateTime(value.date.time) (2)
    }
    Class<Birthday> getEncoderClass() { Birthday } (3)
}
1 Decodes the Birthday type from the BsonReader
2 Encodes the Birthday type to the BsonWriter
3 Returns the type that is to be encoded. In this case Birthday.

With that done you then need to register the custom Codec. There are two ways to achieve this.

You can register a list of codecs in the grails.mongodb.codecs setting in application.yml:

grails:
    mongodb:
        codecs:
            - my.company.BirthdayCodec

Or you can create a META-INF/services/org.bson.codecs.Codec file containing the fully qualified class name of the Codec. If there are multiple codec classes you would like to register, put each one on a separate line.
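For example, the file contains just the fully qualified class names, one per line (the src/main/resources location and the second codec name are illustrative):

src/main/resources/META-INF/services/org.bson.codecs.Codec
my.company.BirthdayCodec
my.company.AnniversaryCodec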

Custom Types with GORM

Another option is to define a GORM custom type. For example consider the following class:

class Birthday implements Comparable{
    Date date

    Birthday(Date date) {
        this.date = date
    }

    @Override
    int compareTo(Object t) {
        date.compareTo(t.date)
    }
}
Custom types should go in src/groovy not grails-app/domain

If you attempt to reference this class from a domain class it will not automatically be persisted for you. However you can create a custom type implementation and register it with Spring. For example:

import com.mongodb.BasicDBObject
import groovy.transform.InheritConstructors
import org.bson.Document
import org.grails.datastore.mapping.engine.types.AbstractMappingAwareCustomTypeMarshaller
import org.grails.datastore.mapping.model.PersistentProperty
import org.grails.datastore.mapping.mongo.query.MongoQuery
import org.grails.datastore.mapping.query.Query

@InheritConstructors
class BirthdayType extends AbstractMappingAwareCustomTypeMarshaller<Birthday, Document, Document> {
   @Override
   protected Object writeInternal(PersistentProperty property, String key, Birthday value, Document nativeTarget) {
       final converted = value.date.time
       nativeTarget.put(key, converted)
       return converted
   }

   @Override
   protected void queryInternal(PersistentProperty property, String key, Query.PropertyCriterion criterion, Document nativeQuery) {
       if (criterion instanceof Query.Between) {
           def dbo = new BasicDBObject()
           dbo.put(MongoQuery.MONGO_GTE_OPERATOR, criterion.getFrom().date.time)
           dbo.put(MongoQuery.MONGO_LTE_OPERATOR, criterion.getTo().date.time)
           nativeQuery.put(key, dbo)
       }
       else {
           nativeQuery.put(key, criterion.value.date.time)
       }
   }

   @Override
   protected Birthday readInternal(PersistentProperty property, String key, Document nativeSource) {
       final num = nativeSource.get(key)
       if (num instanceof Long) {
           return new Birthday(new Date(num))
       }
       return null
   }
}

The above BirthdayType class is a custom user type implementation for MongoDB for the Birthday class. It provides implementations for three methods: readInternal, writeInternal and the optional queryInternal. If you do not implement queryInternal your custom type can be persisted but not queried.

The writeInternal method gets passed the property, the key to store it under, the value and the native Document where the custom type is to be stored:

@Override
protected Object writeInternal(PersistentProperty property, String key, Birthday value, Document nativeTarget) {
    final converted = value.date.time
    nativeTarget.put(key, converted)
    return converted
}

You can then read the values of the custom type and set them on the Document. The readInternal method gets passed the PersistentProperty, the key the user type info is stored under (although you may want to use multiple keys) and the Document:

@Override
protected Birthday readInternal(PersistentProperty property, String key, Document nativeSource) {
    final num = nativeSource.get(key)
    if(num instanceof Long) {
        return new Birthday(new Date(num))
    }
    return null
}

You can then construct the custom type by reading values from the Document. Finally the queryInternal method allows you to handle how a custom type is queried:

@Override
protected void queryInternal(PersistentProperty property, String key, Query.PropertyCriterion criterion, Document nativeQuery) {
    if(criterion instanceof Query.Between) {
        def dbo = new BasicDBObject()
        dbo.put(MongoQuery.MONGO_GTE_OPERATOR, criterion.getFrom().date.time);
        dbo.put(MongoQuery.MONGO_LTE_OPERATOR, criterion.getTo().date.time);
        nativeQuery.put(key, dbo)
    }
    else if(criterion instanceof Query.Equals){
        nativeQuery.put(key, criterion.value.date.time)
    }
    else {
            throw new RuntimeException("unsupported query type for property $property")
    }
}

The method gets passed a criterion which is the type of query and depending on the type of query you may handle the query differently. For example the above implementation supports between and equals style queries. So the following 2 queries will work:

Person.findByBirthday(new Birthday(new Date()-7)) // find someone who was born 7 days ago
Person.findByBirthdayBetween(new Birthday(new Date()-7), new Birthday(new Date())) // find someone who was born in the last 7 days

However "like" or other query types will not work.

To register a custom type in a grails application simply register it as Spring bean. For example, to register the above BirthdayType add the following to grails-app/conf/spring/resources.groovy:

import com.example.*

// Place your Spring DSL code here
beans = {
  birthdayType(BirthdayType, Birthday)
}

Querying

Basic Querying

GORM for MongoDB supports all of the regular methods for executing GORM queries apart from HQL, which is a Hibernate specific query language more appropriate for SQL databases.

If you wish to execute a native MongoDB query you can use the find method that takes a Bson argument. For example:

import com.mongodb.client.FindIterable
import static com.mongodb.client.model.Filters.*
...
FindIterable findIterable = Product.find(eq("title", "coffee"))
findIterable.limit(10)
            .each { Product product ->
    println "Product title $product.title"
}

The find method will return a FindIterable instance that you can then use to further customize via filters, sorting and projections.

For the full MongoDB client model refer to the com.mongodb.client.model package.

The find method will return instances of your domain class for each query. If you wish to instead obtain MongoDB Document instances then you should use the collection property of the domain class:

import com.mongodb.client.FindIterable
import static com.mongodb.client.model.Filters.*
...
Document doc = Product.collection
                        .find(eq("title", "coffee"))
                        .first()

Geospatial Querying

MongoDB supports storing geospatial data in both flat and spherical surface types.

To store data in a flat surface you use a "2d" index, whilst a "2dsphere" index is used for spherical data. GORM for MongoDB supports both, and the following sections describe how to define and query geospatial data.

Geospatial 2D Sphere Support

Using a 2dsphere Index

MongoDB’s 2dsphere indexes support queries that calculate geometries on an earth-like sphere.

Although you can use coordinate pairs in a 2dsphere index, they are considered legacy by the MongoDB documentation and it is recommended you store data using GeoJSON Point types.

MongoDB legacy coordinate pairs are in latitude / longitude order, whilst GeoJSON points are stored in longitude / latitude order!

To support this, GORM for MongoDB features a special type, grails.mongodb.geo.Point, that can be used within domain classes to store geospatial data:

import grails.mongodb.geo.*
...
class Restaurant {
    ObjectId id
    Point location

    static mapping = {
        location geoIndex:'2dsphere'
    }
}

The Point type gets persisted as a GeoJSON Point. A Point can be constructed from coordinates represented in longitude and latitude (the inverse of 2d index location coordinates!). Example:

Restaurant r = new Restaurant(location: new Point(50, 50))
r.save(flush:true)

Restaurant.findByLocation(new Point(50,50))
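For reference, the location is stored in the document as a GeoJSON Point, roughly like this:

{ "location": { "type": "Point", "coordinates": [50, 50] } }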

Querying a 2dsphere Index

Once the 2dsphere index is in place you can use various MongoDB plugin specific dynamic finders to query, including:

  • findBy…​GeoWithin - Find out whether a Point is within a Box, Polygon, Circle or Sphere

  • findBy…​GeoIntersects - Find out whether the given GeoJSON shape intersects with a stored shape

  • findBy…​Near - Find out whether any GeoJSON Shape is near the given Point

  • findBy…​NearSphere - Find out whether any GeoJSON Shape is near the given Point using spherical geometry.

Some examples:

Restaurant.findByLocationGeoWithin( Polygon.valueOf([ [0, 0], [100, 0], [100, 100], [0, 100], [0, 0] ]) )
Restaurant.findByLocationGeoWithin( Box.valueOf( [[25, 25], [100, 100]] ) )
Restaurant.findByLocationGeoWithin( Circle.valueOf( [[50, 50], 100] ) )
Restaurant.findByLocationGeoWithin( Sphere.valueOf( [[50, 50], 0.06]) )
Restaurant.findByLocationNear( Point.valueOf( 40, 40 ) )
Note that a Sphere differs from a Circle in that the radius is specified in radians. There is a special Distance class that can help with radian calculation.

Native Querying Support

In addition to being able to pass any Shape to geospatial query methods you can also pass a map that represents the native values to be passed to the underlying query. For example:

def results = Restaurant.findAllByLocationNear( [$geometry: [type:'Point', coordinates: [1,7]], $maxDistance:30000] )

In the above example the native query parameters are simply passed through to the $near query.

Geospatial 2D Index Support

MongoDB supports 2d indexes that store points on a two-dimensional plane, although they are considered legacy and you should use 2dsphere indexes instead.

It is possible to use a MongoDB 2d index by mapping a list or map property using the geoIndex mapping:

class Hotel {
    String name
    List location

    static mapping = {
        location geoIndex:'2d'
    }
}

By default the index creation assumes latitude/longitude and thus is configured for a -180..180 range. If you are indexing something else you can customise this with indexAttributes:

class Hotel {
    String name
    List location

    static mapping = {
        location geoIndex:'2d', indexAttributes:[min:-500, max:500]
    }
}

You can then save Geo locations using a two dimensional list:

new Hotel(name:"Hilton", location:[50, 50]).save()

Alternatively you can use a map with keys representing latitude and longitude:

new Hotel(name:"Hilton", location:[lat: 40.739037d, long: 73.992964d]).save()
You must specify whether the number is a float or a double by adding an f or d at the end of the number, e.g. 40.739037d. Groovy's default type for decimal literals is BigDecimal, which is not supported by MongoDB.

Once you have your data indexed you can use MongoDB specific dynamic finders to find hotels near a given location:

def h = Hotel.findByLocationNear([50, 60])
assert h.name == 'Hilton'

You can also find a location within a box (bound queries). Boxes are defined by specifying the lower-left and upper-right corners:

def box = [[40.73083d, -73.99756d], [40.741404d,  -73.988135d]]
def h = Hotel.findByLocationWithinBox(box)

You can also find a location within a circle. Circles are specified using a center and radius:

def center = [50, 50]
def radius = 10
def h = Hotel.findByLocationWithinCircle([center, radius])

If you plan on querying a location and some other value it is recommended to use a compound index:

class Hotel {
    String name
    List location
    int stars

    static mapping = {
        compoundIndex location:"2d", stars:1
    }
}

In the example above an index is created for both the location and the number of stars a Hotel has.

GeoJSON Data Models

You can also store any GeoJSON shape using the grails.mongodb.geo.Shape super class:

import grails.mongodb.geo.*
...
class Entry {
    ObjectId id
    Shape shape

    static mapping = {
        shape geoIndex:'2dsphere'
    }
}
...
new Entry(shape: Polygon.valueOf([[[3, 1], [1, 2], [5, 6], [9, 2], [4, 3], [3, 1]]]) ).save()
new Entry(shape: LineString.valueOf([[5, 2], [7, 3], [7, 5], [9, 4]]) ).save()
new Entry(shape: Point.valueOf([5, 2])).save()

And then use the findBy*GeoIntersects method to figure out whether shapes intersect with each other:

assert Entry.findByShapeGeoIntersects( Polygon.valueOf( [[ [0,0], [3,0], [3,3], [0,3], [0,0] ]] ) )
assert Entry.findByShapeGeoIntersects( LineString.valueOf( [[1,4], [8,4]] ) )

Full Text Search

Using MongoDB 2.6 and above you can create full text search indices.

To create a "text" index using the index method inside the mapping block:

class Product {
    ObjectId id
    String title

    static mapping = {
        index title:"text"
    }
}

You can then search for instances using the search method:

assert Product.search("bake coffee cake").size() == 10
assert Product.search("bake coffee -cake").size() == 6

You can search for the top results by rank using the searchTop method:

assert Product.searchTop("cake").size() == 4
assert Product.searchTop("cake",3).size() == 3

And count the number of hits with the countHits method:

assert Product.countHits('coffee') == 5

Multiple Data Sources

GORM for MongoDB supports the notion of multiple data sources where multiple individual MongoClient instances can be configured and switched between.

Configuring Multiple Mongo Clients

To configure multiple Mongo client connections you need to use the grails.mongodb.connections setting. For example in application.yml:

grails-app/conf/application.yml
grails:
    mongodb:
        url: mongodb://localhost/books
        connections:
            moreBooks:
                url: mongodb://localhost/moreBooks
            evenMoreBooks:
                url: mongodb://localhost/moreBooks

You can configure individual settings for each Mongo client. If a setting is not specified, it is inherited from the default Mongo client.

Mapping Domain Classes to Mongo Clients

If a domain class does not specify a Mongo client connection then the default connection is used.

You can use the connection method in the mapping block to configure an alternative Mongo client.

For example, if you want the ZipCode domain class to use a Mongo client connection called 'lookup', configure it like this:

class ZipCode {

   String code

   static mapping = {
      connection 'lookup'
   }
}

A domain class can also use two or more configured Mongo client connections by passing a list of connection names to the connections method, for example:

class ZipCode {

   String code

   static mapping = {
      connections(['lookup', 'auditing'])
   }
}

If a domain class uses the default connection and one or more others, you can use the ConnectionSource.DEFAULT constant to indicate that:

import org.grails.datastore.mapping.core.connections.*

class ZipCode {

   String code

   static mapping = {
      connections(['lookup', ConnectionSource.DEFAULT])
   }
}

If a domain class should use all configured connections, use the value ConnectionSource.ALL:

import org.grails.datastore.mapping.core.connections.*

class ZipCode {

   String code

   static mapping = {
      connection ConnectionSource.ALL
   }
}

Switching between Mongo Clients

You can switch to a different connection at runtime with the withConnection method:

Book.withConnection("moreBooks") {
    Book.list()
}

Any logic executed within the body of the closure will use the alternate connection. Once the closure finishes execution GORM will switch back to the default connection automatically.

The ConnectionSources API

Introduced in GORM 6.0, the ConnectionSources API allows you to introspect the data sources configured for the application:

@Autowired
MongoDatastore mongoDatastore
...
ConnectionSources<MongoClient, MongoConnectionSourceSettings> connectionSources
                                        = mongoDatastore.getConnectionSources()

for(ConnectionSource<MongoClient, MongoConnectionSourceSettings> connectionSource in connectionSources) {
        println "Name $connectionSource.name"
        MongoClient mongoClient = connectionSource.source
}

Switching Database or Collection at Runtime

In addition to storing dynamic attributes, as of version 1.3.0 of the plugin you can also switch which database and/or collection to persist to at runtime.

For example:

Person.withDatabase("administrators") {
    new Person(name:"Bob").save()
}

The above example will save a Person instance to the administrators database. The database is used for the scope of the closure. You can also switch the database for the scope of the active session:

Person.useDatabase("administrators")
new Person(name:"Bob").save()

In addition, there are equivalent withCollection and useCollection methods for switching collection at runtime.
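For example (the collection name archivedPeople is illustrative):

Person.withCollection("archivedPeople") {
    new Person(name: "Bob").save()
}

// or for the scope of the active session
Person.useCollection("archivedPeople")
println Person.count()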

Multi-Tenancy

GORM for MongoDB supports the following multi-tenancy modes:

  • DATABASE - A separate database with a separate connection pool is used to store each tenant's data.

  • SCHEMA - The same database, but different schemas, are used to store each tenant's data.

  • DISCRIMINATOR - The same database is used with a discriminator used to partition and isolate data.

Configuring Multi Tenancy

You can configure multi-tenancy the same way as described in the GORM for Hibernate documentation; simply specify a multi-tenancy mode and resolver:

grails:
    gorm:
        multiTenancy:
            mode: DATABASE
            tenantResolverClass: org.grails.datastore.mapping.multitenancy.web.SubDomainTenantResolver

Note that if you are using MongoDB and Hibernate together the above configuration will configure both MongoDB and Hibernate to use a multi-tenancy mode of DATABASE.

If you want to enable multi-tenancy only for MongoDB, you can use the following configuration instead:

grails:
    mongodb:
        multiTenancy:
            mode: DATABASE
            tenantResolverClass: org.grails.datastore.mapping.multitenancy.web.SubDomainTenantResolver

Multi-Tenancy Transformations

The following transformations can be applied to any class to greatly simplify the development of multi-tenant applications. These include:

  • @CurrentTenant - Resolve the current tenant for the context of a class or method

  • @Tenant - Use a specific tenant for the context of a class or method

  • @WithoutTenant - Execute logic without a specific tenant (using the default connection)

For example:

import grails.gorm.multitenancy.*

// resolve the current tenant for every method
@CurrentTenant
class TeamService {

    // execute the countPlayers method without a tenant id
    @WithoutTenant
    int countPlayers() {
        Player.count()
    }

    // use the tenant id "another" for all GORM logic within the method
    @Tenant({"another"})
    List<Team> allTwoTeams() {
        Team.list()
    }

    List<Team> listTeams() {
        Team.list(max:10)
    }

    @Transactional
    void addTeam(String name) {
        new Team(name:name).save(flush:true)
    }
}

Multi Tenancy Modes

As mentioned previously, GORM for MongoDB supports all three multi-tenancy modes; however, there are some considerations to keep in mind.

Database Per Tenant

When using the DATABASE mode, only GORM method calls are dispatched to the correct tenant. This means the following will use the tenant id:

// switches to the correct client based on the tenant id
Book.list()

However, going directly through the MongoClient will not work:

@Autowired MongoClient mongoClient

// uses the default connection and doesn't resolve the tenant id
mongoClient.getDatabase("book").find()

If you are working directly with the MongoClient instance you need to make sure you obtain the correct instance. For example:

import grails.gorm.multitenancy.*

@Autowired MongoDatastore mongoDatastore
...
MongoClient mongoClient =
        mongoDatastore.getDatastoreForTenantId(Tenants.currentId())
                      .getMongoClient()

Schema Per Tenant

When using the SCHEMA mode, GORM for MongoDB will use a different MongoDB database, but the same MongoClient instance, for each tenant.

However, once again only GORM methods will use the correct database. For example:

// switches to the correct database based on the tenant id
Book.list()

However, getting the database directly from MongoClient will not work:

@Autowired MongoClient mongoClient

// uses the default connection and doesn't resolve the tenant id
mongoClient.getDatabase("book").find()

To resolve this, always use the DB property of the domain class, which ensures the correct database is used:

// switches to the correct database based on the tenant id
Book.DB.find()

Partitioned Multi-Tenancy

When using the DISCRIMINATOR approach, GORM for MongoDB will store a tenantId attribute in each MongoDB document and attempt to partition the data.

Once again this works only when using GORM methods and even then there are cases where it will not work if you use native MongoDB interfaces.

For example the following works fine:

// correctly includes the `tenantId` in the query
Book.list()

As does this:

import static com.mongodb.client.model.Filters.*;

// correctly includes the `tenantId` in the query
Book.find(eq("title", "The Stand")).first()

But the following bypasses the built-in tenant id interception and inclusion:

Book.collection.find().first()

Since you are operating directly on the collection, GORM cannot intercept queries performed against it.

In this case you must include the tenantId manually:

import static com.mongodb.client.model.Filters.*;
...
Book.collection.find(eq("tenantId", Tenants.currentId())).first()

And the same is true of write operations such as inserts that are done with the native API.
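For example, a sketch of a native insert that includes the tenant id manually:

import org.bson.Document
import grails.gorm.multitenancy.Tenants
...
Book.collection.insertOne(
    new Document([title: "The Stand", tenantId: Tenants.currentId()])
)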

Dynamic ConnectionSources

If you are using a multi-tenancy mode of DATABASE then by default the expectation is that all tenants are configured in your application.yml file.

However, it is possible to read your Mongo client connection sources dynamically using MongoConnectionSources.

The MongoConnectionSources class will read the mongo client configurations from a Mongo collection called mongo.connections by default. To configure it you must specify the connectionSourcesClass in application.yml:

grails:
    mongodb:
        multiTenancy:
            mode: DATABASE
        connectionSourcesClass: org.grails.datastore.mapping.mongo.connections.MongoConnectionSources
        connectionsCollection: "myconnections"
...

You can then even add new connections at runtime using the ConnectionSources API:

import grails.gorm.multitenancy.*

@Autowired MongoDatastore mongoDatastore
...
def configuration = [url:"mongodb://localhost/moreBooks"]
MongoClient mongoClient =
        mongoDatastore.connectionSources
                                          .addConnectionSource("moreBooks", configuration)

All new connection sources will be stored in the specified connectionsCollection and, if the application is restarted, they will be read back from that collection.

GORM for MongoDB does not implement provisioning of new MongoDB instances at runtime. This is something that would need to be implemented by a cloud services provider for example.

Stateless Mode

GORM for MongoDB supports both stateless and stateful modes for mapping domain classes to MongoDB. In general stateful mapping is superior for write-heavy applications and stateless mode is better for read-heavy applications (particularly when large amounts of data are involved).

Stateful mode

Domain classes are by default stateful, which means when they are read from a MongoDB document their state is stored in the user session (which is typically bound to the request in Grails). This has several advantages for write heavy applications:

  • GORM can automatically detect whether a call to save() is an update or an insert and act appropriately

  • GORM stores the state of the read MongoDB document and therefore updates to schemaless properties don’t require an extra query

  • GORM can store the current version and therefore implement optimistic locking

  • Repeated reads of the same entity can be retrieved from the cache, thus optimizing reads as well

For an example of when a stateful domain class is better consider the following:

def b = Book.get(1)
b['pages'] = 400
b['publisher'] = 'Manning'
b['rating'] = 5
b.save(flush:true)

With a stateful entity the updates to the three properties can be batched up and executed in a single save() call; without state, a separate update has to be executed for each schemaless property (ouch!).

Stateless Domain classes

However, stateful domain classes can cause problems for read-heavy applications. Take for example the following code:

def books = Book.list() // read 100,000 books
for(b in books) {
    println b.title
}

The above example will read 100,000 books and print the title of each. In stateful mode this will almost certainly run out of memory, because the state of each MongoDB document and each Book instance is retained in memory. Rewriting the code as follows solves the problem:

Book.withStatelessSession {
    def books = Book.list() // read 100,000 books
    for(b in books) {
        println b.title
    }
}

Alternatively you can map the domain class as stateless, in which case its state will never be stored in the session:

class Book {
    ...
    static mapping = {
        stateless true
    }
}

Disadvantages of Stateless Mode

There are several disadvantages to using stateless domain classes as the default. One disadvantage is that if you are using assigned identifiers, GORM cannot detect whether you want to do an insert or an update, so you have to be explicit about which one you want:

Book b = new Book()
b.id = "The Book"
b.insert()

In the above case we use the explicit insert method to tell Grails this is an insert, not an update. Another disadvantage is that reading schemaless/dynamic properties is more costly. For example:

def books = Book.list() // read 100,000 books
for(b in books) {
    println b['pages']
    println b['rating']
}

Here GORM has to execute an additional read for each schemaless property! This is better written as:

def books = Book.list() // read 100,000 books
for(b in books) {
    def dbo = b.dbo
    println dbo['pages']
    println dbo['rating']
}

Thus only requiring one query. Or alternatively you can use the native API:

def books = Book.collection.find() // read 100,000 books
for(dbo in books) {
    Book b = dbo as Book
    println dbo['pages']
    println dbo['rating']
}

Which would be more efficient.

Using the MongoDB Driver Directly

A lower level API is provided by the plugin via the MongoDB driver.

There is an excellent tutorial on how to use the MongoDB Java driver's API directly in the MongoDB documentation.

An example can be seen below:

// Get a db reference in the old fashion way
def db = mongo.getDatabase("mydb")

// Insert a document
db.languages.insert([name: 'Groovy'])
// A less verbose way to do it
db.languages.insert(name: 'Ruby')
// Yet another way
db.languages << [name: 'Python']

// Insert a list of documents
db.languages << [[name: 'Javascript', type: 'prototyped'], [name: 'Ioke', type: 'prototyped']]

To get hold of the mongo instance (which is an instance of the com.mongodb.Mongo class) inside a controller or service simply define a mongo property:

def mongo
def myAction = {
    def db = mongo.getDatabase("mongo")
    db.languages.insert([name: 'Groovy'])
}

A request scoped bean is also available for the default database (typically the name of your application, unless specified by the databaseName config option, plus the suffix "DB").

def peopleDB
def myAction = {
    peopleDB.languages.insert([name: 'Fred'])
}

Each domain class you define also has a collection property that allows easy access to the underlying Collection instance:

Person.collection.count() == 1
Person.collection.findOne(firstName:"Fred").lastName == "Flintstone"

You can easily convert from a native MongoDB Document into an entity using a cast:

def fred = Person.collection.findOne(firstName:"Fred") as Person

Transactions

MongoDB doesn’t support transactions directly, however GORM for MongoDB does batch up inserts and updates until the session is flushed. This makes it possible to support some rollback options.

You can use either transactional services or the static withTransaction method. To mark a service as using the MongoDB transaction manager, use the static transactional property with the value 'mongo':

static transactional = 'mongo'

Alternately you can do ad-hoc transactions using the withTransaction method:

Person.withTransaction { status ->
    new Person(name:"Bob", age:50).save()
    throw new RuntimeException("bad")
    new Person(name:"Fred", age:45).save()
}

In this case neither Person object will be persisted to the database, because under the surface a persistence session is used to batch up both insert operations into a single insert. When the exception is thrown neither insert is executed, providing some transactional semantics at the GORM level.

Using the lower level API you can of course also take advantage of Mongo’s support for Atomic operations.

Unit Testing

To write unit tests with MongoDB and Spock you can simply extend from grails.test.mongodb.MongoSpec.

MongoSpec is an abstract class that will initialise GORM in the setup phase of the specification being executed. By default it uses a MongoClient instance that connects to a MongoDB instance as defined in your configuration (by default, localhost and port 27017; see Getting Started for more details).

It is preferable to use testcontainers to automatically run MongoDB in a containerized environment and not have to run a MongoDB instance locally. The following examples use testcontainers:

package functional.tests

import com.mongodb.client.MongoClient
import com.mongodb.client.MongoClients
import groovy.transform.CompileStatic
import org.testcontainers.containers.MongoDBContainer

@CompileStatic
trait EmbeddedMongoClient {

    abstract MongoDBContainer getMongoDBContainer()

    MongoClient createMongoClient() {
        if (!mongoDBContainer.isRunning()) {
            mongoDBContainer.start()
        }
        return MongoClients.create(mongoDBContainer.getReplicaSetUrl())
    }
}

import grails.test.mongodb.MongoSpec
import grails.validation.ValidationException
import org.testcontainers.containers.MongoDBContainer
import org.testcontainers.utility.DockerImageName
import spock.lang.Ignore
import spock.lang.Shared

class LocalMongoUnitSpec extends MongoSpec implements EmbeddedMongoClient {

    @Shared
    final MongoDBContainer mongoDBContainer = new MongoDBContainer(DockerImageName.parse("mongo:latest"))

    @Ignore
    void "test fail on error"() {

        when:
        def invalid = new Book(title: "")
        invalid.save()

        then:
        thrown ValidationException
        invalid.hasErrors()
    }
}

You can also use your own low-level MongoClient instance, as shown in the following example:

package functional.tests

import grails.test.mongodb.MongoSpec
import org.testcontainers.containers.MongoDBContainer
import org.testcontainers.utility.DockerImageName
import spock.lang.Shared

class BookUnitSpec extends MongoSpec implements EmbeddedMongoClient {

    @Shared
    final MongoDBContainer mongoDBContainer = new MongoDBContainer(DockerImageName.parse("mongo:latest"))

    void "Test low-level API extensions"() {
        when:
        def db = createMongoClient().getDatabase("test")
//        db.drop()
        // Insert a document
        db['languages'].insert([name: 'Groovy'])
        // A less verbose way to do it
        db.languages.insert(name: 'Ruby')
        // Yet another way
        db.languages << [name: 'Python']

        then:
        db.languages.count() == 3
    }

    void "Test GORM access"(){
        when:
        Book book = new Book(title: 'El Quijote').save(flush: true)

        then:
        Book.count() == 1

        when:
        book = Book.findByTitle('El Quijote')

        then:
        book.id
    }

}

Note that the default implementation scans your classpath for domain classes, starting from the package defined in the configuration property grails.codegen.defaultPackage and descending into all of its subpackages. If your application is large, classpath scanning may be slow, so it is better to override the getDomainClasses() method:

@Override
protected List<Class> getDomainClasses() {
    [Book]
}

Integration Testing

There is a plugin available that runs an in-memory Mongo database during your integration tests. Data is cleared between test cases, so tests can work similarly to H2 with @Rollback.

Visit the GitHub page of the Embedded MongoDB Grails Plugin to learn more.

Reference

Beans

mongo

Purpose

Provides access to the native MongoClient instance.

Examples
import com.mongodb.client.MongoClient
import com.mongodb.client.MongoDatabase

class FooController {
    MongoClient mongo
    def myAction() {
        MongoDatabase db = mongo.getDatabase("mongo")
        db.languages.insert([name: 'Groovy'])
    }
}
Description

See the MongoDB Java Driver API documentation for usage information.

Domain Classes

collection

Purpose

Returns the MongoDB collection used for the current domain class

Examples
def bookBson = Book.collection.find().first()
Description

The collection property allows access to the underlying MongoDB MongoCollection object, thus allowing direct access to the low-level MongoDB driver.
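
Any method of the native MongoCollection API can be invoked through this property. A short sketch, reusing the Book examples above (the title filter value is an assumption):

import com.mongodb.client.model.Filters

// count all documents in the collection backing Book
long total = Book.collection.countDocuments()
// run a native filtered query and take the first match
def match = Book.collection.find(Filters.eq("title", "El Quijote")).first()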

collectionName

Purpose

Returns the name of the MongoDB collection used for the current domain class

Examples
println Book.collectionName
Description

The collectionName property allows introspection of the name of the MongoCollection used by a given domain class. It can be used in conjunction with useCollection to switch to different collections and back again.
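
For example, a sketch of switching to another collection and then restoring the original (the "non-fiction" collection name is an assumption):

def original = Book.collectionName
Book.useCollection("non-fiction")
// ... work against the alternate collection ...
Book.useCollection(original)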

countHits

Purpose

Executes a MongoDB $text search query and returns the number of hits.

Examples
assert Product.countHits("coffee") == 5
Description

The countHits method uses MongoDB’s full text search support to perform a full text search against a "text" index and returns the number of matching documents.

DB

Purpose

Returns the MongoDB MongoDatabase object.

Examples
MongoCollection dbCollection = Book.DB.getCollection("books")
Description

The DB property allows access to the underlying MongoDB MongoDatabase object, thus allowing easy access to the low-level MongoDB Java driver.

dbo

Purpose

Returns the MongoDB Document for an instance of a domain class

Using the Document object directly is discouraged, because it’s inefficient. It’s better to use Dynamic Attributes.
Examples
def b = Book.get(1)

println b.dbo
Description

The dbo property allows access to the underlying MongoDB Document, which is a representation of the stored BSON document that can be manipulated in memory.
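
As a sketch of the dynamic-attribute alternative mentioned above (the pages attribute is a hypothetical example):

def b = Book.get(1)
// arbitrary attributes can be read and written with the subscript operator
b['pages'] = 150
b.save(flush: true)
assert b['pages'] == 150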

findByGeoIntersects

Purpose

Executes a MongoDB $geoIntersects query

Examples

Given:

import grails.mongodb.geo.*
...
class Entry {
    ObjectId id
    Shape shape

    static mapping = {
        shape geoIndex:'2dsphere'
    }
}
...
new Entry(shape: Polygon.valueOf([[[3, 1], [1, 2], [5, 6], [9, 2], [4, 3], [3, 1]]]) ).save()
new Entry(shape: LineString.valueOf([[5, 2], [7, 3], [7, 5], [9, 4]]) ).save()
new Entry(shape: Point.valueOf([5, 2])).save()

And then use the findBy*GeoIntersects method to figure out whether shapes intersect with each other:

assert Entry.findByShapeGeoIntersects( Polygon.valueOf( [[ [0,0], [3,0], [3,3], [0,3], [0,0] ]] ) )
assert Entry.findByShapeGeoIntersects( LineString.valueOf( [[1,4], [8,4]] ) )
// native query
assert Entry.findByShapeGeoIntersects( [ $geometry : [type: "Polygon" ,
                                                      coordinates: [ [ [ 0 , 0 ] , [ 3 , 6 ] , [ 6 , 1 ] , [ 0 , 0 ] ] ]
                                                      ]
                                       ])
Description

The $geoIntersects operator is a geospatial query operator that selects all locations that intersect with a GeoJSON object. See $geoIntersects.

findByGeoWithin

Purpose

Executes a MongoDB $geoWithin query

Examples
Restaurant.findByLocationGeoWithin( Polygon.valueOf([ [0, 0], [100, 0], [100, 100], [0, 100], [0, 0] ]) )
Restaurant.findByLocationGeoWithin( Box.valueOf( [[25, 25], [100, 100]] ) )
Restaurant.findByLocationGeoWithin( Circle.valueOf( [[50, 50], 100] ) )
Restaurant.findByLocationGeoWithin( Sphere.valueOf( [[50, 50], 0.06]) )
// native query
Restaurant.findByPointGeoWithin([ '$polygon': [ [0.0d, 0.0d], [3.0d, 0.0d], [3.0d, 3.0d], [0.0d, 3.0d], [0.0d, 0.0d] ] ])
Description

The $geoWithin operator is a geospatial query operator that queries for a defined point, line or shape that exists entirely within another defined shape. When determining inclusion, MongoDB considers the border of a shape to be part of the shape, subject to the precision of floating point numbers. See $geoWithin for more information.

findByNear

Purpose

Executes a MongoDB $near query

Examples
import grails.mongodb.geo.*
...
Restaurant.findByLocationNear( Point.valueOf( 40, 40 ) )
// native query
Restaurant.findAllByLocationNear( [$geometry: [type:'Point', coordinates: [1,7]], $maxDistance:30000] )
// criteria query
Restaurant.withCriteria {
    near 'location', Point.valueOf(1,7), 300000
}
Description

Specifies a point for which a geospatial query returns the closest documents first. The query sorts the documents from nearest to farthest. See $near documentation for more info.

findByNearSphere

Purpose

Executes a MongoDB $nearSphere query

Examples
import grails.mongodb.geo.*
...
Restaurant.findByLocationNearSphere( Point.valueOf( 40, 40 ) )
// native query
Restaurant.findAllByLocationNearSphere( [$geometry: [type:'Point', coordinates: [1,7]], $maxDistance:30000] )
// criteria query
Restaurant.withCriteria {
    nearSphere 'location', Point.valueOf(1,7), 300000
}
Description

Specifies a point for which a geospatial query returns the closest documents first. The query sorts the documents from nearest to farthest. MongoDB calculates distances for $nearSphere using spherical geometry.

See the documentation for the $nearSphere query operator.

findByWithinBox

Purpose

Executes a MongoDB $within query on legacy coordinate pairs

The $within operator is considered legacy and has been replaced by $geoWithin. Hence this method is deprecated and findByGeoWithin should be used instead.
Examples
Hotel.findByLocationWithinBox( [[40, 30],[60, 70]] )
Hotel.findByLocationWithinBox( Box.valueOf([[40, 30],[60, 70]]) )

findByWithinCircle

Purpose

Executes a MongoDB $within query on legacy coordinate pairs

The $within operator is considered legacy and has been replaced by $geoWithin. Hence this method is deprecated and findByGeoWithin should be used instead.
Examples
Hotel.findByLocationWithinCircle([[40, 30],40])
Hotel.findByLocationWithinCircle( Circle.valueOf( [[40, 30],40] ) )

search

Purpose

Executes a MongoDB $text search query

Examples
assert Product.search("coffee").size() == 5
assert Product.search("bake coffee cake").size() == 10
assert Product.search("bake coffee -cake").size() == 6
assert Product.search('"Coffee Cake"').size() == 1

assert Product.search('tarta', [language:'es', offset:5, max:10])
Description

The search method uses MongoDB’s full text search support to perform full text search on a "text" index.
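
The search, searchTop and countHits methods all require a "text" index on the property being queried. A minimal mapping sketch (the Product class and title property are assumptions based on the examples above):

import org.bson.types.ObjectId

class Product {
    ObjectId id
    String title

    static mapping = {
        title index: "text"   // defines a MongoDB "text" index on 'title'
    }
}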

searchTop

Purpose

Executes a MongoDB $text search query

Examples
assert Product.searchTop("coffee").size() == 5
assert Product.searchTop("coffee", 3)
Description

The searchTop method uses MongoDB’s full text search support to perform full text search on a "text" index with the results sorted by the MongoDB score. The method by default returns the top 5 results, but the second argument can be used to customize the number of results (top 3, top 10 etc.)

useCollection

Purpose

Allows switching the collection used to persist the domain class for the scope of the current session (connection).

Examples
Book.useCollection("non-fiction")
Description

The useCollection method allows switching, at runtime, the collection used to persist and retrieve domain class instances. The collectionName property will return the current collection being used. Note that the method switches the collection only for the scope of the current session/connection (i.e. it is not permanent). If you wish to permanently change the collection used, configure the mapping of the domain class.

useDatabase

Purpose

Allows switching the database used to persist the domain class for the scope of the current session (connection).

Examples
Book.useDatabase("non-fiction")
Description

The useDatabase method allows switching, at runtime, the database used to persist and retrieve domain class instances. The DB property will return the current database being used. Note that the method switches the database only for the scope of the current session/connection (i.e. it is not permanent). If you wish to permanently change the database used, configure the mapping of the domain class.

withCollection

Purpose

Allows switching the collection used to persist the domain class for the scope of the given closure.

Examples
Book.withCollection("non-fiction") {
    // code here
}
Description

The withCollection method allows switching, at runtime, the collection used to persist and retrieve domain class instances. The collectionName property will return the current collection being used. Note that the method switches the collection only for the scope of the given closure (i.e. it is not permanent). If you wish to permanently change the collection used, configure the mapping of the domain class.

withDatabase

Purpose

Allows switching the database used to persist the domain class for the scope of the given closure.

Examples
Book.withDatabase("non-fiction") {
    // code here
}
Description

The withDatabase method allows switching, at runtime, the database used to persist and retrieve domain class instances. The DB property will return the current database being used. Note that the method switches the database only for the scope of the given closure (i.e. it is not permanent). If you wish to permanently change the database used, configure the mapping of the domain class.