Using JSON to Model Complex OAuth Scopes
Summary: Let’s use JSON scopes instead of encoding scope data structures into plain scope strings based on arbitrary grammars.
In most implementations of OAuth 2.0, scopes are plain strings. As more complex applications adopt OAuth 2.0, there is an increasing need for more complex scopes. For example, in healthcare applications based on HL7 FHIR, the SMART profile suggests scopes such as patient/Observation.read, which is interpreted as the permission to read resources of type Observation belonging to the current patient. In this example, there are two micro-scopes, each of a different type: Observation, which is the resource type, and read, which is the access mode.
We cannot grant these as two separate scopes, since we also need to reflect the fact that the read operation is granted only on resources of type Observation, i.e. the two micro-scopes are tied together. In other words, what we want to grant is a tuple containing two micro-scopes. The SMART profile proposes a custom grammar to encode such a tuple into a plain string.
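To make the structure of these strings concrete, here is a rough Python sketch of how a SMART-style scope string can be split into its micro-scopes. The regular expression covers only the simple context/resourceType.accessMode shape described above, not the full SMART grammar.
import re

# Rough sketch: split a SMART-style scope string into its micro-scopes.
# This covers only the simple <context>/<resourceType>.<accessMode> shape
# described above, not the full SMART grammar (wildcards, etc.).
SMART_SCOPE = re.compile(r"^(patient|user)/(\*|[A-Za-z]+)\.(read|write|\*)$")

def parse_smart_scope(scope):
    match = SMART_SCOPE.match(scope)
    if match is None:
        raise ValueError("unrecognized scope: " + scope)
    context, resource_type, access_mode = match.groups()
    # The plain string really encodes a tuple of micro-scopes:
    return {"context": context, "resourceType": resource_type, "accessMode": access_mode}

print(parse_smart_scope("patient/Observation.read"))
# {'context': 'patient', 'resourceType': 'Observation', 'accessMode': 'read'}
Every party that consumes these scopes ends up writing some variation of this parsing logic, which is exactly the burden discussed below.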
When there are other types of micro-scopes (e.g. purpose of use or confidentiality labels), these tuples are going to get larger and the grammar will become more complicated. For example, to grant the read operation on Observation resources of Normal confidentiality and only for the purpose of Treatment, we need to expand this grammar to accommodate all these micro-scopes (e.g. see this post by John Moehrke).
A simpler and more reasonable solution is to encode the scopes in JSON. For example, the following JSON scope grants read access to Observation and Immunization resources.
{
  "resourceType": ["Observation", "Immunization"],
  "accessType": ["read"]
}
The Permission structure in User-Managed Access (UMA) seems to be a preliminary step in this direction; UMA permissions are basically a simple JSON structure replacing plain OAuth scopes.
In the rest of this post, I will discuss some of the reasons why this is a better solution.
Custom Parsing, Encoding, and Other Logic
With custom grammars, developers need to understand the grammar and write their own scope parser, encoder, and other related functions, such as testing whether a requested access is covered by the granted scopes. This adds to the cognitive load and can be costly and error-prone. Moreover, as the specs evolve, the grammar may have to change, which means more work for developers and raises issues such as extensions and backward compatibility. JSON, on the other hand, is already a familiar data structure for developers, is supported in most modern programming languages and platforms, and is flexible and straightforward to extend.
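As a point of comparison, here is a minimal sketch of the JSON alternative: parsing is a single standard-library call, and adding a new type of micro-scope requires no change to any grammar or parser. The field names are simply the ones used in the examples in this post, not a standardized schema.
import json

# Parsing a JSON scope is one standard-library call; no custom parser needed.
granted = json.loads('{"resourceType": ["Observation", "Immunization"], "accessType": ["read"]}')
print(granted["resourceType"])  # ['Observation', 'Immunization']

# A scope extended with an additional micro-scope type (here "purposeOfUse")
# parses exactly the same way; consumers can ignore fields they do not understand.
extended = json.loads(
    '{"resourceType": ["Observation"], "accessType": ["read"],'
    ' "purposeOfUse": [{"system": "http://hl7.org/fhir/ValueSet/v3-PurposeOfUse", "code": "TREAT"}]}'
)
print(extended["purposeOfUse"][0]["code"])  # TREAT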
Modeling Complex Values
Some micro-scopes are not necessarily atomic strings. For example, many of the FHIR value sets (e.g. purpose of use) are concept descriptors composed of at least a system ID and a code. Flattening these structures makes the grammar even more complicated and less readable. Modeling concept descriptors in JSON is straightforward as shown in the following example which includes a purpose of use micro-scope:
{
  "resourceType": ["Observation"],
  "accessType": ["read"],
  "purposeOfUse": [
    {
      "system": "http://hl7.org/fhir/ValueSet/v3-PurposeOfUse",
      "code": "TREAT"
    }
  ]
}
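Checking a granted purpose of use then amounts to comparing the system and code of the requested concept against the granted ones. The helper below is only illustrative; its name and the scope layout follow the example above rather than any spec.
# Illustrative helper: is a requested purpose of use (system + code) covered
# by the granted scope? The scope layout follows the JSON example above.
def purpose_granted(granted_scope, system, code):
    return any(
        p.get("system") == system and p.get("code") == code
        for p in granted_scope.get("purposeOfUse", [])
    )

granted_scope = {
    "resourceType": ["Observation"],
    "accessType": ["read"],
    "purposeOfUse": [
        {"system": "http://hl7.org/fhir/ValueSet/v3-PurposeOfUse", "code": "TREAT"}
    ],
}

print(purpose_granted(granted_scope,
                      "http://hl7.org/fhir/ValueSet/v3-PurposeOfUse", "TREAT"))   # True
print(purpose_granted(granted_scope,
                      "http://hl7.org/fhir/ValueSet/v3-PurposeOfUse", "HMARKT"))  # False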
Modeling Lists
What if we want to grant read and write access to Observation resources? If the custom grammar does not allow multiple values for a type of micro-scope, we will have to create two scopes: patient/Observation.read and patient/Observation.write. Aside from the redundancy (repeating Observation), the custom grammar has somehow obscured the simple fact that we are granting two access modes on the same resource type. Because of the custom grammar, developers have to implement custom logic to compare scopes and check things like whether a requested access mode is granted by a given set of scopes. Note that the SMART grammar allows wildcards to alleviate this problem (e.g. patient/Observation.*), but in the case of micro-scopes with a larger value set (e.g. purpose of use), wildcards do not solve this problem. In JSON, this can simply be modeled as an array, with straightforward logic that is already supported by existing libraries, as sketched below.
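For example, a coverage check against such a scope reduces to simple membership tests on the arrays. The sketch below uses the field names from the examples in this post; they are not part of any standard.
# Sketch of the "straightforward logic" mentioned above: are a requested
# resource type and access mode covered by the granted JSON scope?
def access_granted(granted_scope, resource_type, access_type):
    return (
        resource_type in granted_scope.get("resourceType", [])
        and access_type in granted_scope.get("accessType", [])
    )

granted_scope = {"resourceType": ["Observation"], "accessType": ["read", "write"]}

print(access_granted(granted_scope, "Observation", "write"))   # True
print(access_granted(granted_scope, "Immunization", "read"))   # False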