GraphQL::FragmentCache powers up graphql-ruby with the ability to cache response fragments: you can mark any field as cached and it will never be resolved again (at least, while cache is valid). For instance, the following code caches title for each post:
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: true
end
```
Add the gem to your Gemfile (`gem "graphql-fragment_cache"`) and add the plugin to your schema class:
```ruby
class GraphqSchema < GraphQL::Schema
  use GraphQL::FragmentCache

  query QueryType
end
```

Include `GraphQL::FragmentCache::Object` into your base type class:
```ruby
class BaseType < GraphQL::Schema::Object
  include GraphQL::FragmentCache::Object
end
```

If you're using resolvers, include the module into your base resolver as well:
```ruby
class Resolvers::BaseResolver < GraphQL::Schema::Resolver
  include GraphQL::FragmentCache::ObjectHelpers
end
```

Now you can add the `cache_fragment:` option to your fields to turn caching on:
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: true
end
```

Alternatively, you can use the `cache_fragment` method inside resolver methods:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment { Post.find(id) }
  end
end
```

Cache keys consist of the following parts: the namespace, the implicit key, and the explicit key.
The namespace is prefixed to every cache key. The default namespace is `graphql`, and it's configurable:

```ruby
GraphQL::FragmentCache.namespace = "graphql"
```

The implicit part of a cache key contains information about the schema and the current query. It includes:
- Hex digest of the schema definition (to make sure the cache is cleared when the schema changes).
- The current query fingerprint, consisting of the path to the field, the arguments, and the selection set.
Let's take a look at an example:
```ruby
query = <<~GQL
  query {
    post(id: 1) {
      id
      title
      cachedAuthor {
        id
        name
      }
    }
  }
GQL

schema_cache_key = GraphqSchema.schema_cache_key
path_cache_key = "post(id:1)/cachedAuthor"
selections_cache_key = "[#{%w[id name].join(".")}]"
query_cache_key = Digest::SHA1.hexdigest("#{path_cache_key}#{selections_cache_key}")

cache_key = "#{schema_cache_key}/#{query_cache_key}/#{object_cache_key}"
```

You can override `schema_cache_key`, `query_cache_key`, `path_cache_key` or `object_cache_key` by passing parameters to `cache_fragment` calls:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(query_cache_key: "post(#{id})") { Post.find(id) }
  end
end
```

Overriding `path_cache_key` might be helpful when you resolve the same object nested in multiple places (e.g., `Post` and `Comment` both have `author`) but want to make sure the cache is invalidated when the selection set is different.
The same works for the `cache_fragment:` option:
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: {query_cache_key: "post_title"}
end
```

Overriding `object_cache_key` is helpful when the value that is cached is different from the one used as a key, e.g., a database query that is processed before caching:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    query = Post.where("updated_at < ?", Time.now - 1.day)
    cache_fragment(object_cache_key: query.cache_key) { query.some_process }
  end
end
```

You can influence the way GraphQL arguments are included in the cache key.
A use case might be a `:renew_cache` parameter that forces a cache rewrite but should not be part of the cache key itself. Use `cache_key: {exclude_arguments: [...]}` to specify a list of arguments to be excluded from the implicit cache key:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
    argument :renew_cache, Boolean, required: false
  end

  def post(id:, renew_cache: false)
    context.scoped_set!(:renew_cache, true) if renew_cache
    cache_fragment(cache_key: {exclude_arguments: [:renew_cache]}) { Post.find(id) }
  end
end
```

Likewise, you can use `cache_key: {include_arguments: [...]}` to specify an allowlist of arguments
to be included in the cache key. In this case all arguments for the cache key must be specified, including
parent arguments of nested fields.
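The exclude/include filtering can be sketched in plain Ruby. This is a hypothetical helper for illustration, not the gem's internals; the function name and signature are assumptions:

```ruby
# Illustrative sketch: filter resolver arguments before they are folded
# into the implicit cache key (hypothetical helper, not the gem's code).
def arguments_for_cache_key(arguments, exclude_arguments: nil, include_arguments: nil)
  return arguments.reject { |name, _| exclude_arguments.include?(name) } if exclude_arguments
  return arguments.select { |name, _| include_arguments.include?(name) } if include_arguments

  arguments
end

args = {id: 1, renew_cache: true}
arguments_for_cache_key(args, exclude_arguments: [:renew_cache]) # => {id: 1}
arguments_for_cache_key(args, include_arguments: [:id])          # => {id: 1}
```

Either way, only the surviving arguments influence the key, so toggling `renew_cache` no longer produces a different cache entry.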
In most cases you want your cache key to depend on the resolved object (say, an ActiveRecord model). You can do that by passing an argument to the `#cache_fragment` method, similar to the `#cache` method in Rails views:
```ruby
def post(id:)
  post = Post.find(id)
  cache_fragment(post) { post }
end
```

You can pass arrays as well to build a compound cache key:
```ruby
def post(id:)
  post = Post.find(id)
  cache_fragment([post, current_account]) { post }
end
```

You can omit the block if its return value is the same as the cached object:
```ruby
# the following line
cache_fragment(post)
# is the same as
cache_fragment(post) { post }
```

Using literals: even when the same string is used for all queries, the cache still varies per argument and per selection set (because of the query key):
```ruby
def post(id:)
  cache_fragment("find_post") { Post.find(id) }
end
```

Combining with options:
```ruby
def post(id:)
  cache_fragment("find_post", expires_in: 5.minutes) { Post.find(id) }
end
```

Dynamic cache key:
```ruby
def post(id:)
  last_updated_at = Post.select(:updated_at).find_by(id: id)&.updated_at
  cache_fragment(last_updated_at, expires_in: 5.minutes) { Post.find(id) }
end
```

Note the use of `.select(:updated_at)` when fetching the cache key value: it keeps this verification query as fast and light as possible.
You can also add `touch: true` to the corresponding `belongs_to` association (e.g., the author's `belongs_to :post`), so that the post is touched, and its cache invalidated, whenever the author is updated.
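The dynamic-key idea above can be illustrated with a plain-Ruby toy (not the gem's implementation): when the key incorporates a timestamp, bumping the timestamp changes the key and forces recomputation, while the old entry simply goes stale.

```ruby
# Toy cache keyed by strings; the gem's store behaves analogously.
CACHE = {}

def cache_fragment_like(key)
  # Return the stored value for the key, computing and storing it on a miss
  CACHE.fetch(key) { CACHE[key] = yield }
end

cache_fragment_like("post/1-100") { "old title" } # computed and stored
cache_fragment_like("post/1-100") { "ignored" }   # => "old title" (cache hit)
cache_fragment_like("post/1-200") { "new title" } # => "new title" (key changed)
```

Here `"post/1-100"` stands for a key built from a record id and its `updated_at`; touching the record yields `"post/1-200"` and a fresh computation.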
When using the `cache_fragment:` option, it's only possible to use the resolved value as a cache key by setting:
```ruby
field :post, PostType, null: true, cache_fragment: {cache_key: :object} do
  argument :id, ID, required: true
end

# this is equal to
def post(id:)
  cache_fragment(Post.find(id))
end
```

Also, you can pass `:value` to the `cache_key:` option to use the returned value to build the key:
```ruby
field :post, PostType, null: true, cache_fragment: {cache_key: :value} do
  argument :id, ID, required: true
end

# this is equal to
def post(id:)
  post = Post.find(id)
  cache_fragment(post) { post }
end
```

If you need more control, you can set `cache_key:` to any custom code:
```ruby
field :posts,
  Types::Objects::PostType.connection_type,
  cache_fragment: {cache_key: -> { object.posts.maximum(:created_at) }}
```

The cache key part for the passed argument is generated as follows:
- Use `object_cache_key: "some_cache_key"` if passed to `cache_fragment`.
- Use `#graphql_cache_key` if implemented.
- Use `#cache_key` (or `#cache_key_with_version` for modern Rails) if implemented.
- Use `self.to_s` for primitive types (strings, symbols, numbers, booleans).
- Raise `ArgumentError` if none of the above applies.
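The lookup chain above can be sketched in plain Ruby. This is a simplified illustration, not the gem's actual code; the `object_cache_key:` override happens at the call site and is omitted here:

```ruby
# Simplified sketch of the fallback chain for building the key part
# from a cached object.
def object_key_part(object)
  if object.respond_to?(:graphql_cache_key)
    object.graphql_cache_key
  elsif object.respond_to?(:cache_key_with_version)
    object.cache_key_with_version
  elsif object.respond_to?(:cache_key)
    object.cache_key
  elsif [String, Symbol, Numeric, TrueClass, FalseClass].any? { |type| object.is_a?(type) }
    object.to_s
  else
    raise ArgumentError, "Can't build a cache key for #{object.inspect}"
  end
end

# A Rails-like model exposing #cache_key
Post = Struct.new(:id, :updated_at) do
  def cache_key
    "posts/#{id}-#{updated_at}"
  end
end

object_key_part(Post.new(1, 20240101)) # => "posts/1-20240101"
object_key_part(:draft)                # => "draft"
```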
By default, we do not take context into account when calculating cache keys. That's because caching is more efficient when it's context-free.
However, if you want some fields to be cached per context, you can do that either by passing context objects directly to the `#cache_fragment` method (see above) or by adding a `context_key` option to `cache_fragment:`.
For instance, imagine a query that returns the current user's social profiles:
```graphql
query {
  socialProfiles {
    provider
    id
  }
}
```

You can cache the result using the context (`context[:user]`) as a cache key:
```ruby
class QueryType < BaseObject
  field :social_profiles, [SocialProfileType], null: false, cache_fragment: {context_key: :user}

  def social_profiles
    context[:user].social_profiles
  end
end
```

This is equal to using `#cache_fragment` the following way:
```ruby
class QueryType < BaseObject
  field :social_profiles, [SocialProfileType], null: false

  def social_profiles
    cache_fragment(context[:user]) { context[:user].social_profiles }
  end
end
```

To cache conditionally, use the `if:` (or `unless:`) option:
```ruby
def post(id:)
  cache_fragment(if: current_user.nil?) { Post.find(id) }
end

# or

field :post, PostType, cache_fragment: {if: -> { current_user.nil? }} do
  argument :id, ID, required: true
end

# or

field :post, PostType, cache_fragment: {if: :current_user?} do
  argument :id, ID, required: true
end
```

You can configure default options that will be passed to all `cache_fragment` calls and `cache_fragment:` configurations. For example:
```ruby
GraphQL::FragmentCache.configure do |config|
  config.default_options = {
    expires_in: 1.hour, # Expire cache keys after 1 hour
    schema_cache_key: nil # Do not clear the cache on each schema change
  }
end
```

You can force the cache to renew during query execution by adding `renew_cache: true` to the query context:

```ruby
MyAppSchema.execute("query { posts { title } }", context: {renew_cache: true})
```

This will treat any cached value as missing even if it's present, and will store freshly computed values in the cache. This can be useful for cache warmers.
It's up to you to decide which caching engine to use; all you need to do is configure the cache store:
```ruby
GraphQL::FragmentCache.configure do |config|
  config.cache_store = MyCacheStore.new
end
```

Or, in Rails:

```ruby
# config/application.rb (or config/environments/<environment>.rb)
Rails.application.configure do |config|
  # arguments and options are the same as for `config.cache_store`
  config.graphql_fragment_cache.store = :redis_cache_store
end
```

The store must implement `#read(key)`, `#exist?(key)`, and either `#write_multi(hash, **options)` or `#write(key, value, **options)`.
The gem provides only an in-memory store out of the box (`GraphQL::FragmentCache::MemoryStore`), which is used by default.
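For reference, a minimal custom store satisfying the interface described above might look like the following. This is an illustrative sketch with assumed names (`TinyCacheStore`, `Entry`); a production store should also handle serialization, `#write_multi`, and thread safety:

```ruby
# Minimal cache store sketch implementing #read, #exist? and #write
# with expires_in support.
class TinyCacheStore
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @data = {}
  end

  def read(key)
    entry = @data[key]
    return nil if entry.nil?

    # Drop expired entries lazily on read
    if entry.expires_at && Time.now > entry.expires_at
      @data.delete(key)
      return nil
    end

    entry.value
  end

  def exist?(key)
    !read(key).nil?
  end

  def write(key, value, expires_in: nil, **)
    expires_at = expires_in && Time.now + expires_in
    @data[key] = Entry.new(value, expires_at)
    value
  end
end

store = TinyCacheStore.new
store.write("post:1", "Hello", expires_in: 300)
store.read("post:1") # => "Hello"
```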
You can pass store-specific options to `#cache_fragment` or `cache_fragment:`. For example, to set expiration (assuming the store's `#write` method supports the `expires_in` option):
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: {expires_in: 5.minutes}
end

class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(expires_in: 5.minutes) { Post.find(id) }
  end
end
```

If you are using Dataloader, you will need to let the gem know by passing `dataloader: true`:
```ruby
class PostType < BaseObject
  field :author, User, null: false

  def author
    cache_fragment(dataloader: true) do
      dataloader.with(AuthorDataloaderSource).load(object.id)
    end
  end
end

# or

class PostType < BaseObject
  field :author, User, null: false, cache_fragment: {dataloader: true}

  def author
    dataloader.with(AuthorDataloaderSource).load(object.id)
  end
end
```

The problem is that I didn't find a way to detect that Dataloader (and, therefore, Fiber) is used, so the block is forced to resolve, moving the N+1 inside the Dataloader source class.
If you want to call `#cache_fragment` from places other than fields or resolvers, you'll need to pass the context explicitly and turn on `raw_value` support. For instance, let's take a look at this extension:
```ruby
class Types::QueryType < Types::BaseObject
  class CurrentMomentExtension < GraphQL::Schema::FieldExtension
    # turning on cache_fragment support
    include GraphQL::FragmentCache::ObjectHelpers

    def resolve(object:, arguments:, context:)
      # context is passed explicitly
      cache_fragment(context: context) do
        result = yield(object, arguments)
        "#{result} (at #{Time.now})"
      end
    end
  end

  field :event, String, null: false, extensions: [CurrentMomentExtension]

  def event
    "something happened"
  end
end
```

With this approach you can use `#cache_fragment` anywhere you have access to the context. When the context is not available, the error `cannot find context, please pass it explicitly` will be thrown.
If you have a fragment that is accessed multiple times (e.g., a list of items that belong to the same owner, where the owner is cached), you can avoid multiple cache reads by using the `:keep_in_context` option:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(keep_in_context: true, expires_in: 5.minutes) { Post.find(id) }
  end
end
```

This can reduce the number of cache calls but increases memory usage, because the value returned from the cache is kept in the GraphQL context until the query is fully resolved.
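The idea behind `keep_in_context` can be sketched in plain Ruby (a simplified toy, not the gem's code; `CountingStore` and `cached_fragment` are assumed names): once a fragment is read or computed, it is memoized in the per-query context, so later lookups in the same query skip the store entirely.

```ruby
# A store that counts reads so we can observe the saved round-trips.
class CountingStore
  attr_reader :reads

  def initialize
    @data = {}
    @reads = 0
  end

  def read(key)
    @reads += 1
    @data[key]
  end

  def write(key, value)
    @data[key] = value
  end
end

def cached_fragment(store, context, key)
  context[:loaded_fragments] ||= {}
  # Serve from the per-query context first; fall back to the store once
  context[:loaded_fragments].fetch(key) do
    value = store.read(key) || store.write(key, yield)
    context[:loaded_fragments][key] = value
  end
end

store = CountingStore.new
context = {}
3.times { cached_fragment(store, context, "owner:1") { "Alice" } }
store.reads # => 1 (subsequent lookups were served from the context)
```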
Sometimes errors happen during query resolution, and it might make sense to skip caching for such queries (for instance, imagine a situation where the client has no access to the requested field and the backend returns `{ data: {}, errors: ["you need a permission to fetch orders"] }`). This is how this behavior can be turned on (it's off by default!):
```ruby
GraphQL::FragmentCache.skip_cache_when_query_has_errors = true
```

As a result, caching will be skipped when the `errors` array is not empty.
Cache processing can be disabled if needed. For example:
```ruby
GraphQL::FragmentCache.enabled = false if Rails.env.test?
```

It may be useful to capture cache lookup events. When monitoring is enabled, the `cache_key`, `operation_name`, `path` and a boolean indicating a cache hit or miss will be sent to a `cache_lookup_event` method. This method can be implemented in your application to handle the event.
Example handler defined in a Rails initializer:
```ruby
module GraphQL
  module FragmentCache
    class Fragment
      def self.cache_lookup_event(**args)
        # Monitoring such as incrementing a cache hit counter metric
      end
    end
  end
end
```

Like caching itself, monitoring can be enabled if needed. It is disabled by default. For example:
```ruby
GraphQL::FragmentCache.monitoring_enabled = true
```

graphql-batch and graphql-ruby-fragment_cache do not play well together. The problem appears when `cache_fragment` is called inside a `.then` block:
```ruby
def cached_author_inside_batch
  AuthorLoader.load(object).then do |author|
    cache_fragment(author, context: context)
  end
end
```

The problem is that the context is not properly populated inside the block (the gem uses `:current_path` to build the cache key). There are two possible workarounds: use dataloaders, or manage `:current_path` manually:
```ruby
def cached_author_inside_batch
  outer_path = context.namespace(:interpreter)[:current_path]

  AuthorLoader.load(object).then do |author|
    context.namespace(:interpreter)[:current_path] = outer_path
    cache_fragment(author, context: context)
  end
end
```

- Caching does not work for Union types, because of the `Lookahead` implementation: it requires the exact type to be passed to the `selection` method (you can find the discussion here). This method is used for cache key building, and I haven't found a workaround yet (PR in progress). If you get a `Failed to look ahead the field` error, please pass `path_cache_key` explicitly:
```ruby
field :cached_avatar_url, String, null: false

def cached_avatar_url
  cache_fragment(path_cache_key: "post_avatar_url(#{object.id})") { object.avatar_url }
end
```

Based on the original gist by @palkan and @ssnickolay.
Initially sponsored by Evil Martians.
Bug reports and pull requests are welcome on GitHub at https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache.
The gem is available as open source under the terms of the MIT License.