enhance: normalize/denormalize interface updates (#3134)
* enhance: Change Schema.normalize interface
* enhance: Update denormalize() args order
* enhance: Reorder normalize arguments
* enhance: Make argsKey optional in memo.query and memo.buildQueryKey
* enhance: Extract EntitySchema type helpers into own file
* enhance: Destructure store in normalize()
ntucker committed Jul 8, 2024
1 parent 3c95531 commit 2ad1811
Showing 95 changed files with 1,830 additions and 1,805 deletions.
45 changes: 45 additions & 0 deletions .changeset/hungry-spies-design.md
@@ -0,0 +1,45 @@
---
'@data-client/normalizr': minor
'@data-client/endpoint': minor
'@data-client/graphql': minor
'@data-client/rest': minor
'@data-client/react': patch
'@data-client/core': patch
---

Change the Schema.normalize `visit()` interface, removing non-contextual arguments.

```ts
/** Visits next data + schema while recursively normalizing */
export interface Visit {
(schema: any, value: any, parent: any, key: any, args: readonly any[]): any;
creating?: boolean;
}
```

This results in a 10% normalize performance boost.

```ts title="Before"
processedEntity[key] = visit(
processedEntity[key],
processedEntity,
key,
this.schema[key],
addEntity,
visitedEntities,
storeEntities,
args,
);
```

```ts title="After"
processedEntity[key] = visit(
this.schema[key],
processedEntity[key],
processedEntity,
key,
args,
);
```

The information these arguments provided is now supplied by [closing](https://en.wikipedia.org/wiki/Closure_(computer_programming)) `visit()` over them.
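A minimal sketch of the closure approach follows. `makeVisit`, the `addEntity` shape, and `UserSchema` here are hypothetical stand-ins to illustrate the idea, not the library's actual internals:

```ts
// Store callbacks are captured once by the closure, so recursive visit()
// calls only carry the contextual arguments.
interface Visit {
  (schema: any, value: any, parent: any, key: any, args: readonly any[]): any;
  creating?: boolean;
}

function makeVisit(
  addEntity: (entityKey: string, processed: object, id: string) => void,
): Visit {
  const visit: Visit = (schema, value, parent, key, args) => {
    if (schema && typeof schema.normalize === 'function') {
      // addEntity comes from the closure; it no longer needs threading
      return schema.normalize(value, parent, key, args, visit, addEntity);
    }
    return value; // plain values pass through unchanged
  };
  return visit;
}

// A toy schema that registers itself in the store and returns its pk
const UserSchema = {
  key: 'User',
  normalize(
    input: any,
    _parent: any,
    _key: any,
    _args: readonly any[],
    _visit: Visit,
    addEntity: (entityKey: string, processed: object, id: string) => void,
  ) {
    addEntity(this.key, input, String(input.id));
    return String(input.id);
  },
};

const entities: Record<string, Record<string, any>> = {};
const visit = makeVisit((entityKey, processed, id) => {
  (entities[entityKey] ??= {})[id] = processed;
});
const result = visit(UserSchema, { id: 1, name: 'Ann' }, undefined, undefined, []);
```

Because every recursive call shares one closure, fewer arguments cross each call boundary, which is where the performance gain comes from.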
35 changes: 35 additions & 0 deletions .changeset/purple-cougars-unite.md
@@ -0,0 +1,35 @@
---
'@data-client/normalizr': minor
'@data-client/endpoint': minor
'@data-client/graphql': minor
'@data-client/rest': minor
'@data-client/react': patch
'@data-client/core': patch
---

Change the Schema.normalize interface from direct data access to accessor functions like `getEntity`

```ts
interface SchemaSimple {
normalize(
input: any,
parent: any,
key: any,
args: any[],
visit: (schema: any, value: any, parent: any, key: any, args: readonly any[]) => any,
addEntity: (...args: any) => any,
getEntity: (...args: any) => any,
checkLoop: (...args: any) => any,
): any;
}
```

We also add `checkLoop()`, which moves some logic from [Entity](https://dataclient.io/rest/api/Entity)
into the core normalize algorithm.

```ts
/** Returns true if a circular reference is found */
export interface CheckLoop {
(entityKey: string, pk: string, input: object): boolean;
}
```
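One possible implementation of such a check is sketched below; the names and bookkeeping are illustrative assumptions, and the core algorithm's actual logic may differ:

```ts
// Returns true if a circular reference is found: the same input object
// re-entering normalization for the same entity key + pk means a cycle.
interface CheckLoop {
  (entityKey: string, pk: string, input: object): boolean;
}

function makeCheckLoop(): CheckLoop {
  // Remember which input objects are already being normalized, per entity
  const visiting: Record<string, Record<string, object[]>> = {};
  return (entityKey, pk, input) => {
    const seen = ((visiting[entityKey] ??= {})[pk] ??= []);
    if (seen.includes(input)) return true; // same object re-entered: cycle
    seen.push(input);
    return false;
  };
}

const checkLoop = makeCheckLoop();
const article: any = { id: '1' };
article.self = article; // a circular payload
const first = checkLoop('Article', '1', article);
const second = checkLoop('Article', '1', article);
```

The first call registers the object and returns `false`; re-entering with the same object returns `true`, letting the normalizer bail out instead of recursing forever.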
23 changes: 23 additions & 0 deletions .changeset/purple-cougars-unite2.md
@@ -0,0 +1,23 @@
---
'@data-client/normalizr': minor
'@data-client/endpoint': minor
'@data-client/graphql': minor
'@data-client/rest': minor
'@data-client/react': patch
'@data-client/core': patch
---

Change Schema.denormalize's `unvisit` to take the [schema](https://dataclient.io/rest/api/schema) argument first.

```ts
interface SchemaSimple {
denormalize(
input: {},
args: readonly any[],
unvisit: (schema: any, input: any) => any,
): T;
}
```
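A sketch of how a schema's `denormalize()` forwards to the schema-first `unvisit`. `ArticleShape` and this driver are hypothetical illustrations, not library code:

```ts
type Unvisit = (schema: any, input: any) => any;

// Driver: dispatch to the schema's denormalize, or pass the value through
const unvisit: Unvisit = (schema, input) =>
  schema && typeof schema.denormalize === 'function'
    ? schema.denormalize(input, [], unvisit)
    : input;

const ArticleShape = {
  schema: { author: undefined } as Record<string, any>, // leaf field here
  denormalize(
    input: Record<string, any>,
    _args: readonly any[],
    unvisit: Unvisit,
  ) {
    const out: Record<string, any> = { ...input };
    for (const key of Object.keys(this.schema)) {
      // schema argument first, matching the new interface
      out[key] = unvisit(this.schema[key], input[key]);
    }
    return out;
  },
};

const value = unvisit(ArticleShape, { title: 'Hi', author: { id: '1' } });
```

Putting the schema first makes every recursive call read the same way as the top-level one: dispatch on schema, then data.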

26 changes: 26 additions & 0 deletions .changeset/rich-frogs-move.md
@@ -0,0 +1,26 @@
---
'@data-client/normalizr': minor
---

Change normalize() interface

```ts
function normalize(
schema,
input,
{ date, expiresAt, fetchedAt, args },
{ entities, indexes, entityMeta },
);
```

#### Usage

```ts
const { result, entities, indexes, entityMeta } = normalize(
action.endpoint.schema,
payload,
action.meta,
state,
);
```

15 changes: 15 additions & 0 deletions .changeset/sharp-birds-tie.md
@@ -0,0 +1,15 @@
---
'@data-client/normalizr': minor
---

Change denormalize() interface

```ts
function denormalize(schema, input, entities, args);
```

#### Usage

```ts
const value = denormalize(endpoint.schema, input, state.entities, args);
```
7 changes: 7 additions & 0 deletions .changeset/silly-eagles-knock.md
@@ -0,0 +1,7 @@
---
'@data-client/test': patch
'@data-client/img': patch
'@data-client/ssr': patch
---

Expand peerdep support range to include ^0.14.0
13 changes: 13 additions & 0 deletions .changeset/smooth-houses-tickle.md
@@ -0,0 +1,13 @@
---
'@data-client/normalizr': minor
---

Change MemoCache methods interface

```ts
class MemoCache {
denormalize(schema, input, entities, args): { data, paths };
query(schema, args, entities, indexes): data;
buildQueryKey(schema, args, entities, indexes): normalized;
}
```
2 changes: 1 addition & 1 deletion .gitignore
@@ -73,4 +73,4 @@ typings/
# build info
**/tsconfig*.tsbuildinfo

/codemods/
/codemods
12 changes: 4 additions & 8 deletions docs/rest/api/schema.md
@@ -129,7 +129,7 @@ class Article extends Entity {
import { normalize } from '@data-client/normalizr';

const args = [{ id: '123' }];
const normalizedData = normalize(originalData, Article, args);
const normalizedData = normalize(Article, originalData, args);
```

Now, `normalizedData` will create a single serializable source of truth for all entities:
@@ -175,8 +175,8 @@ Now, `normalizedData` will create a single serializable source of truth for all
import { denormalize } from '@data-client/normalizr';

const denormalizedData = denormalize(
normalizedData.result,
Article,
normalizedData.result,
normalizedData.entities,
args,
);
@@ -218,14 +218,14 @@ import { MemoCache } from '@data-client/normalizr';
const memo = new MemoCache();

const { data, paths } = memo.denormalize(
normalizedData.result,
Article,
normalizedData.result,
normalizedData.entities,
args,
);
const { data: data2 } = memo.denormalize(
normalizedData.result,
Article,
normalizedData.result,
normalizedData.entities,
args,
);
@@ -242,11 +242,7 @@ is an Array of paths of all entities included in the result.
`memo.query()` allows denormalizing [Queryable](#queryable) based on args alone, rather than a normalized input.

```ts
// key is just any serialization of args
const key = JSON.stringify(args);

const data = memo.query(
key,
Article,
args,
normalizedData.entities,
2 changes: 1 addition & 1 deletion examples/benchmark/README.md
@@ -23,7 +23,7 @@ Performance compared to normalizr package (higher is better):

| | no cache | with cache |
| ------------------- | -------- | ---------- |
| normalize (long) | 113% | 113% |
| normalize (long) | 119% | 119% |
| denormalize (long) | 158% | 1,262% |
| denormalize (short) | 676% | 2,367% |

46 changes: 18 additions & 28 deletions examples/benchmark/normalizr.js
@@ -18,111 +18,101 @@ import {
} from './schemas.js';
import userData from './user.json' with { type: 'json' };

const { result, entities } = normalize(data, ProjectSchema);
const queryState = normalize(data, ProjectQuery);
const { result, entities } = normalize(ProjectSchema, data);
const queryState = normalize(ProjectQuery, data);
const queryMemo = new MemoCache();
queryState.result = queryMemo.buildQueryKey(
'',
ProjectQuery,
[],
queryState.entities,
queryState.indexes,
);
const queryInfer = queryMemo.buildQueryKey(
'',
ProjectQuerySorted,
[],
queryState.entities,
queryState.indexes,
);

let githubState = normalize(userData, User);
let githubState = normalize(User, userData);

const actionMeta = {
fetchedAt: Date.now(),
date: Date.now(),
expiresAt: Date.now() + 10000000,
args: [],
};

export default function addNormlizrSuite(suite) {
const memo = new MemoCache();
// prime the cache
memo.denormalize(result, ProjectSchema, entities, []);
memo.denormalize(queryState.result, ProjectQuery, queryState.entities, []);
memo.denormalize(ProjectSchema, result, entities, []);
memo.denormalize(ProjectQuery, queryState.result, queryState.entities, []);
%OptimizeFunctionOnNextCall(memo.denormalize);
%OptimizeFunctionOnNextCall(denormalize);
%OptimizeFunctionOnNextCall(normalize);

let curState = initialState;
return suite
.add('normalizeLong', () => {
normalize(
data,
ProjectSchema,
[],
curState.entities,
curState.indexes,
curState.entityMeta,
actionMeta,
);
normalize(ProjectSchema, data, actionMeta, curState);
curState = { ...initialState, entities: {}, endpoints: {} };
})
.add('infer All', () => {
return new MemoCache().buildQueryKey(
'',
ProjectQuery,
[],
queryState.entities,
queryState.indexes,
);
})
.add('denormalizeLong', () => {
return new MemoCache().denormalize(result, ProjectSchema, entities);
return new MemoCache().denormalize(ProjectSchema, result, entities);
})
.add('denormalizeLong donotcache', () => {
return denormalize(result, ProjectSchema, entities);
return denormalize(ProjectSchema, result, entities);
})
.add('denormalizeShort donotcache 500x', () => {
for (let i = 0; i < 500; ++i) {
denormalize('gnoff', User, githubState.entities);
denormalize(User, 'gnoff', githubState.entities);
}
})
.add('denormalizeShort 500x', () => {
for (let i = 0; i < 500; ++i) {
new MemoCache().denormalize('gnoff', User, githubState.entities);
new MemoCache().denormalize(User, 'gnoff', githubState.entities);
}
})
.add('denormalizeShort 500x withCache', () => {
for (let i = 0; i < 500; ++i) {
memo.denormalize('gnoff', User, githubState.entities, []);
memo.denormalize(User, 'gnoff', githubState.entities, []);
}
})
.add('denormalizeLong with mixin Entity', () => {
return new MemoCache().denormalize(result, ProjectSchemaMixin, entities);
return new MemoCache().denormalize(ProjectSchemaMixin, result, entities);
})
.add('denormalizeLong withCache', () => {
return memo.denormalize(result, ProjectSchema, entities, []);
return memo.denormalize(ProjectSchema, result, entities, []);
})
.add('denormalizeLong All withCache', () => {
return memo.denormalize(
queryState.result,
ProjectQuery,
queryState.result,
queryState.entities,
[],
);
})
.add('denormalizeLong Query-sorted withCache', () => {
return memo.denormalize(
queryInfer,
ProjectQuerySorted,
queryInfer,
queryState.entities,
[],
);
})
.add('denormalizeLongAndShort withEntityCacheOnly', () => {
memo.endpoints = new WeakDependencyMap();
memo.denormalize(result, ProjectSchema, entities);
memo.denormalize('gnoff', User, githubState.entities);
memo.denormalize(ProjectSchema, result, entities);
memo.denormalize(User, 'gnoff', githubState.entities);
})
.on('complete', function () {
if (process.env.SHOW_OPTIMIZATION) {
2 changes: 1 addition & 1 deletion examples/normalizr-github/index.js
@@ -22,8 +22,8 @@ const request = https.request(

res.on('end', () => {
const normalizedData = normalize(
JSON.parse(data),
schema.IssueOrPullRequest,
JSON.parse(data),
);
const out = JSON.stringify(normalizedData, null, 2);
fs.writeFileSync(path.resolve(__dirname, './output.json'), out);
4 changes: 2 additions & 2 deletions examples/normalizr-redux/src/redux/modules/commits.js
@@ -31,7 +31,7 @@ export const getCommits =
repo,
})
.then(response => {
const data = normalize(response.data, [schema.Commit]);
const data = normalize([schema.Commit], response.data);
dispatch(addEntities(data.entities));
return response;
})
@@ -40,4 +40,4 @@ });
});
};

export const selectHydrated = (state, id) => denormalize(id, Commit, state);
export const selectHydrated = (state, id) => denormalize(Commit, id, state);