
Using multiple datasources with Apollo and Neo4j

malik
Node Clone

Hello community,

I'm currently trying to extend my setup so that my Apollo GraphQL server (the lambda version) can use an additional data store (DynamoDB in this particular case). I stumbled upon a couple of mentions of writing custom resolvers for additional data sources, but trying to implement any of them has not given me working results. I don't understand what I'm doing wrong here.

Let me quickly set the context:
I have a single typeDefs file where I define all of my Neo4j and non-Neo4j types, queries, and mutations.

Here's a snippet of my schema that I'd like to write a custom resolver for:

type PodcastBasicStatisticsSubmission {
  podcastId: String
}

So, in my dynamodb resolver file (just for the sake of a test) I do:

const resolvers = {
  PodcastBasicStatisticsSubmission: {
    podcastId: (obj, params, ctx, resolveInfo) => {
      return 'test';
    }
  }
};

export default resolvers;
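
Eventually this resolver should read from DynamoDB instead of returning a hard-coded string. A rough sketch of what I'm aiming for (the table name, key, and query field are placeholders, using the aws-sdk DocumentClient):

import AWS from 'aws-sdk';

const docClient = new AWS.DynamoDB.DocumentClient();

const resolvers = {
  Query: {
    // hypothetical query that loads a submission from DynamoDB;
    // 'PodcastStatistics' is a placeholder table name
    podcastBasicStatisticsSubmission: async (obj, { podcastId }, ctx, resolveInfo) => {
      const result = await docClient
        .get({ TableName: 'PodcastStatistics', Key: { podcastId } })
        .promise();
      return result.Item;
    },
  },
};

export default resolvers;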

Then in my main Apollo configuration file I do:

import { makeAugmentedSchema } from 'neo4j-graphql-js';
import dynamodbResolvers from './databases/dynamodb/dynamodb';

const resolvers = dynamodbResolvers;
const schema = makeAugmentedSchema({
  typeDefs,
  resolvers,
  config: {
    // keep neo4j-graphql-js from auto-generating queries/mutations for this type
    query: {
      exclude: ['PodcastBasicStatisticsSubmission']
    },
    mutation: {
      exclude: ['PodcastBasicStatisticsSubmission']
    }
  }
});

I do NOT include resolvers in my ApolloServer configuration, as demonstrated by @William_Lyon here:
Example by William Lyon
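
For completeness, my server setup currently looks more or less like this (a sketch of the lambda handler; the export name and the way the driver is created may differ in your project):

import { ApolloServer } from 'apollo-server-lambda';

// 'schema' is the augmented schema from makeAugmentedSchema above;
// note that no separate 'resolvers' option is passed here
const server = new ApolloServer({
  schema,
  context: ({ event }) => ({
    headers: event.headers,
    driver,
  }),
});

export const handler = server.createHandler();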

In Apollo Playground I do see my type definition but, as expected, no queries or mutations.
I can't query this type the way I normally can query everything that lives in my Neo4j instance and has resolvers generated by makeAugmentedSchema.

Clearly, I'm missing some fundamental stuff here but what?
Any clues?

1 ACCEPTED SOLUTION

MuddyBootsCode
Graph Steward

I think what you're trying to do is merge the two schemas together. You'll have the types and resolvers generated by GRANDstack, and then the other types and resolvers you're writing for DynamoDB. I've got a somewhat similar situation where I stitch a REST endpoint into my GraphQL application. The code is as follows:

// imports for the library functions used below (assuming graphql-tools v4 style
// APIs and graphql-type-json); the Auth0Auth helper, typeDefs, and directives
// are local to my project
import { makeAugmentedSchema } from 'neo4j-graphql-js';
import { makeExecutableSchema, mergeSchemas } from 'graphql-tools';
import GraphQLJSON from 'graphql-type-json';

const augmentedSchema = makeAugmentedSchema({
  typeDefs,
  resolvers: {
    JSON: GraphQLJSON,
    Query: {
      Users: (parent, args, context, info) => Auth0Auth.getAllUsers(),
    },
    Mutation: {
      setUserRole: (parent, { userID, roles }, context, info) => Auth0Auth.setUserAppMetadata(userID, roles),
      CreateUser: (parent, { email, password }, context, info) => Auth0Auth.createUser({ email, password }),
      DeleteUser: (parent, { userID }, context, info) => Auth0Auth.deleteUser(userID),
    },
  },
});

// the REST-backed part of the schema, built separately
const diSchema = makeExecutableSchema({
  typeDefs: diTypeDefs,
  resolvers: diResolvers,
});

// merge the neo4j-graphql-js schema with the REST schema
const withRestSchema = mergeSchemas({
  schemas: [augmentedSchema, diSchema],
  schemaDirectives: {
    isAuthenticated: IsAuthenticatedDirective,
    hasRole: HasRoleDirective,
  },
});

What I had to do was bring in the other side with makeExecutableSchema and then merge the two with mergeSchemas. Then I build my server like so:

// ApolloServer comes from apollo-server, apollo-server-express, or
// apollo-server-lambda, depending on how you deploy
const server = new ApolloServer({
  context: ({ req }) => {
    return {
      headers: req.headers,
      driver,
    };
  },
  schema: withRestSchema,
  // enable introspection (and the playground, if you need it)
  introspection: true,
  // playground: true,
});

I think this is what you're trying to do. I hope it helps.
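
Applied to your DynamoDB case, the same pattern might look roughly like this (the query name, the stub resolver, and the graphql-tools v4 style imports are assumptions on my part, so adjust to your setup):

import { makeExecutableSchema, mergeSchemas } from 'graphql-tools';

// type definitions for the DynamoDB-backed part only,
// including a Query field so the type is actually reachable
const dynamoTypeDefs = `
  type PodcastBasicStatisticsSubmission {
    podcastId: String
  }

  type Query {
    podcastBasicStatisticsSubmission(podcastId: String!): PodcastBasicStatisticsSubmission
  }
`;

const dynamoResolvers = {
  Query: {
    // placeholder; this is where your DynamoDB lookup would go
    podcastBasicStatisticsSubmission: (obj, { podcastId }, ctx, info) => ({ podcastId }),
  },
};

const dynamoSchema = makeExecutableSchema({
  typeDefs: dynamoTypeDefs,
  resolvers: dynamoResolvers,
});

// augmentedSchema is the schema you already build with makeAugmentedSchema
const mergedSchema = mergeSchemas({
  schemas: [augmentedSchema, dynamoSchema],
});

You'd then pass mergedSchema to ApolloServer instead of the augmented schema alone. You may also want to keep the PodcastBasicStatisticsSubmission type in only one of the two typeDefs so the merge doesn't end up with conflicting definitions.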


2 REPLIES

malik
Node Clone

@MuddyBootsCode is an honourable community member for a reason! Lovely. Thank you.