MLIBZ-2316: Server Side Delta Set #270

Merged: 65 commits, May 18, 2018
Changes from 12 commits
Commits (65)
0de2af8
MLIBZ-2316: Delete deltaFetch.js
Mar 14, 2018
4c94c27
MLIBZ-2316: Add deltaset.js
Mar 14, 2018
50c07d5
MLIBZ-2316: Remove export of deltaFetch
Mar 14, 2018
4268547
MLIBZ-2316: Use new deltaSet function in NetworkRepository
Mar 14, 2018
5c52d42
Remove TODO comment
Mar 14, 2018
c8cbdee
MLIBZ-2316: Add clearDeltaSet function
Mar 14, 2018
a968f77
MLIBZ-2316: Add tests for using delta set with a CacheStore
Mar 14, 2018
4df7ded
MLIBZ-2316: Rename useDeltaFetch to useDeltaSet
Mar 14, 2018
82084f6
MLIBZ-2316: Add delta set constants
Mar 16, 2018
9c0c06b
MLIBZ-2316: Split delta set logic into smaller functions
Mar 16, 2018
c0ab3cd
MLIBZ-2316: Add tests for delta set
Mar 16, 2018
8d69573
Update package versions to 3.10.1-beta.1
Mar 16, 2018
e8ef15f
Make requested changes from code review for delta set
Mar 22, 2018
873c2c1
Don't perform _replaceOfflineEntities() when using delta set
Mar 22, 2018
05a5957
Fix typo and change name from xKinveyRequestStartHeader to kinveyRequ…
Mar 23, 2018
e580555
Don't duplicate function calls in SyncManager.pull()
Mar 23, 2018
1899e2d
Fix check for if statement for SyncManager.pull()
Mar 23, 2018
27dc2a7
Only replace entities if useDeltaSet is not true
Mar 23, 2018
d3a3a01
Default options to an empty object in SyncManager.pull()
Mar 23, 2018
1f6f40f
Fix failing unit tests related to changes
Mar 23, 2018
1ff68a1
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
Mar 26, 2018
89467e4
Add checks for skip and limit modifiers to delta set
Mar 27, 2018
c13d78a
MLIBZ-2316: Handle skip and limit for Delta Set
Mar 28, 2018
fce6bf7
Update logged message for delta set
Mar 28, 2018
8184843
Use original query when performing regular GET request for delta set
Mar 29, 2018
d57ff1e
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner Apr 3, 2018
1af0d2d
MLIBZ-2316: Send a regular GET request if a query contains a skip and…
thomasconner Apr 5, 2018
9379afd
MLIBZ-2437: Send a regular GET request if a ParamaterValueOutOfRangeE…
thomasconner Apr 5, 2018
68865da
Add missing curly brace
thomasconner Apr 5, 2018
78476e2
Fixing failing tests
thomasconner Apr 5, 2018
ed0bcf8
Add InvalidCachedQuery error
thomasconner Apr 20, 2018
a12d70a
Export InvalidCachedQuery error
thomasconner Apr 20, 2018
14e88be
Add querycache.js
thomasconner Apr 20, 2018
dff6376
Add a getter for X-Kinvey-Request-Start header to Headers class
thomasconner Apr 20, 2018
5024ce1
Remove constants
thomasconner Apr 20, 2018
8a3f71e
Refactor SyncManager to use new deltaset and querycache files
thomasconner Apr 20, 2018
6c4ed9b
Use requestStart getter from headers
thomasconner Apr 20, 2018
d43c036
Fix failing unit tests
thomasconner Apr 23, 2018
dfe3a40
Bug fixes for Delta Set
thomasconner Apr 25, 2018
f93a726
Fix some bugs found during testing
thomasconner Apr 25, 2018
cfce9ac
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner Apr 25, 2018
7d35f64
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner May 1, 2018
fdbe8ec
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner May 3, 2018
cc466e9
Fix skipped unit tests
thomasconner May 3, 2018
f5c58c8
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner May 4, 2018
aa0e1ca
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner May 8, 2018
ee18586
MLIBZ-2478: Return count of items changed when making a _deltaset req…
thomasconner May 8, 2018
0ab65a7
Merge branch 'MLIBZ-2478' into MLIBZ-2316_server_side_delta_set
thomasconner May 8, 2018
022dd26
MLIBZ-2477: Handle NotFoundError when deleting a cachedQuery
thomasconner May 8, 2018
86262f2
Merge branch 'MLIBZ-2477' into MLIBZ-2316_server_side_delta_set
thomasconner May 8, 2018
16b5738
MLIBZ-2486: Ignore limit and skip for auto pagination and update tests
thomasconner May 9, 2018
270fe24
Merge branch 'MLIBZ-2486' into MLIBZ-2316_server_side_delta_set
thomasconner May 9, 2018
25bd9ae
MLIBZ-2493: Perform delta set delete and then delta set changed seria…
thomasconner May 9, 2018
c43d78c
Merge branch 'MLIBZ-2493' into MLIBZ-2316_server_side_delta_set
thomasconner May 9, 2018
2e36219
MLIBZ-2449: Do not filter out _QueryCache table or __testSupport__ ta…
thomasconner May 10, 2018
315e335
Merge branch 'MLIBZ-2449' into MLIBZ-2316_server_side_delta_set
thomasconner May 10, 2018
e89ebc0
MLIBZ-2495: Ignore deleting entities when limit and skip are used to …
thomasconner May 11, 2018
87c86fc
Merge branch 'MLIBZ-2495' into MLIBZ-2316_server_side_delta_set
thomasconner May 11, 2018
bb358b8
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner May 14, 2018
1d6775a
Update _queryCache after auto pagination has completed
thomasconner May 18, 2018
2fbc5f5
MLIBZ-2512: Ignore skip value of query for auto pagination
thomasconner May 18, 2018
26e52e3
Merge branch 'MLIBZ-2512' into MLIBZ-2316_server_side_delta_set
thomasconner May 18, 2018
8766cb6
Fix failing unit test that checks if skip is ignored for auto pagination
thomasconner May 18, 2018
40318c7
Merge branch 'master' into MLIBZ-2316_server_side_delta_set
thomasconner May 18, 2018
5152da3
Added deltaset tests (#294)
May 18, 2018
2 changes: 1 addition & 1 deletion packages/kinvey-angular-sdk/package.json
@@ -1,6 +1,6 @@
{
"private": true,
"version": "3.10.1",
"version": "3.10.1-beta.1",
"name": "kinvey-angular-sdk",
"description": "Kinvey JavaScript SDK for Angular/Ionic applications.",
"author": "Kinvey, Inc.",
2 changes: 1 addition & 1 deletion packages/kinvey-angular2-sdk/package.json
@@ -1,6 +1,6 @@
{
"private": true,
"version": "3.10.1",
"version": "3.10.1-beta.1",
"name": "kinvey-angular2-sdk",
"description": "Kinvey JavaScript SDK for Angular 2+/Ionic 2+ applications.",
"author": "Kinvey, Inc.",
2 changes: 1 addition & 1 deletion packages/kinvey-html5-sdk/package.json
@@ -1,6 +1,6 @@
{
"private": true,
"version": "3.10.1",
"version": "3.10.1-beta.1",
"name": "kinvey-html5-sdk",
"description": "Kinvey JavaScript SDK for HTML5 applications.",
"author": "Kinvey, Inc.",
2 changes: 1 addition & 1 deletion packages/kinvey-js-sdk/package.json
@@ -1,6 +1,6 @@
{
"private": true,
"version": "3.10.1",
"version": "3.10.1-beta.1",
"name": "kinvey-js-sdk",
"description": "Kinvey JavaScript SDK for developing JavaScript applications.",
"author": "Kinvey, Inc.",
2 changes: 1 addition & 1 deletion packages/kinvey-nativescript-sdk/package.json
@@ -1,6 +1,6 @@
{
"private": true,
"version": "3.10.1",
"version": "3.10.1-beta.1",
"name": "kinvey-nativescript-sdk",
"description": "Kinvey NativeScript SDK for developing NativeScript applications.",
"author": "Kinvey, Inc.",
2 changes: 1 addition & 1 deletion packages/kinvey-node-sdk/package.json
@@ -1,6 +1,6 @@
{
"private": true,
"version": "3.10.1",
"version": "3.10.1-beta.1",
"name": "kinvey-node-sdk",
"description": "Kinvey JavaScript SDK for Node.js applications.",
"author": "Kinvey, Inc.",
2 changes: 1 addition & 1 deletion packages/kinvey-phonegap-sdk/package.json
@@ -1,6 +1,6 @@
{
"private": true,
"version": "3.10.1",
"version": "3.10.1-beta.1",
"name": "kinvey-phonegap-sdk",
"description": "Kinvey JavaScript SDK for Cordova/PhoneGap applications.",
"author": "Kinvey, Inc.",
9 changes: 7 additions & 2 deletions src/core/datastore/cachestore.js
@@ -24,6 +24,11 @@ export class CacheStore extends NetworkStore {
*/
this.ttl = options.ttl || undefined;
this.syncManager = syncManagerProvider.getSyncManager();

/**
* @type {boolean}
*/
this.useDeltaSet = options.useDeltaSet === true;
}

/**
@@ -110,7 +115,7 @@
* @return {Promise.<number>} Promise
*/
pull(query, options = {}) {
- options = assign({ useDeltaFetch: this.useDeltaFetch }, options);
+ options = assign({ useDeltaSet: this.useDeltaSet }, options);
return this.syncManager.getSyncItemCountByEntityQuery(this.collection, query)
.then((count) => {
if (count > 0) {
@@ -133,7 +138,7 @@
* @return {Promise.<{push: [], pull: number}>} Promise
*/
sync(query, options) {
- options = assign({ useDeltaFetch: this.useDeltaFetch }, options);
+ options = assign({ useDeltaSet: this.useDeltaSet }, options);
const result = {};
return this.push(query, options)
.then((pushResult) => {
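Taken together, the cachestore.js changes above let a store opt in once, with pull() and sync() then defaulting to useDeltaSet: true. A minimal usage sketch follows; the import path, the null second constructor argument (mirroring the specs below), and the collection name are illustrative assumptions, not part of this diff.

// Sketch only: mirrors the CacheStore usage in cachestore.spec.js below.
import { CacheStore } from './cachestore';

// Opting in once makes pull() and sync() default to useDeltaSet: true.
const store = new CacheStore('books', null, { useDeltaSet: true });

store.pull()
  .then((count) => {
    // First pull: regular GET; the response's X-Kinvey-Request-Start is stored in _QueryCache.
    console.log(`initial pull fetched ${count} entities`);
    return store.pull();
  })
  .then((count) => {
    // Later pulls: GET .../_deltaset?since=<stored time>; only changed/deleted entities are applied.
    console.log(`delta pull applied ${count} changes`);
  });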
134 changes: 134 additions & 0 deletions src/core/datastore/cachestore.spec.js
@@ -190,6 +190,82 @@ describe('CacheStore', () => {
});
});

it('should perform a delta set request', (done) => {
const entity1 = { _id: randomString() };
const entity2 = { _id: randomString() };
const store = new CacheStore(collection, null, { useDeltaSet: true });
const onNextSpy = expect.createSpy();
const lastRequestDate = new Date();

nock(store.client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}`)
.reply(200, [entity1, entity2], {
'X-Kinvey-Request-Start': lastRequestDate.toISOString()
});

store.pull()
.then(() => {
const changedEntity2 = Object.assign({}, entity2, { title: 'test' });
nock(client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}/_deltaset`)
.query({ since: lastRequestDate.toISOString() })
.reply(200, { changed: [changedEntity2], deleted: [{ _id: entity1._id }] }, {
'X-Kinvey-Request-Start': new Date().toISOString()
});

store.find()
.subscribe(onNextSpy, done, () => {
try {
expect(onNextSpy.calls.length).toEqual(2);
expect(onNextSpy.calls[0].arguments).toEqual([[entity1, entity2]]);
expect(onNextSpy.calls[1].arguments).toEqual([[changedEntity2]]);
done();
} catch (error) {
done(error);
}
});
})
.catch(done);
});

it('should perform a delta set request with a tagged datastore', (done) => {
const entity1 = { _id: randomString() };
const entity2 = { _id: randomString() };
const store = new CacheStore(collection, null, { useDeltaSet: true, tag: randomString() });
const onNextSpy = expect.createSpy();
const lastRequestDate = new Date();

nock(store.client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}`)
.reply(200, [entity1, entity2], {
'X-Kinvey-Request-Start': lastRequestDate.toISOString()
});

store.pull()
.then(() => {
const changedEntity2 = Object.assign({}, entity2, { title: 'test' });
nock(client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}/_deltaset`)
.query({ since: lastRequestDate.toISOString() })
.reply(200, { changed: [changedEntity2], deleted: [{ _id: entity1._id }] }, {
'X-Kinvey-Request-Start': new Date().toISOString()
});

store.find()
.subscribe(onNextSpy, done, () => {
try {
expect(onNextSpy.calls.length).toEqual(2);
expect(onNextSpy.calls[0].arguments).toEqual([[entity1, entity2]]);
expect(onNextSpy.calls[1].arguments).toEqual([[changedEntity2]]);
done();
} catch (error) {
done(error);
}
});
})
.catch(done);
});

it('should remove entities that no longer exist on the backend from the cache', (done) => {
const entity1 = { _id: randomString() };
const entity2 = { _id: randomString() };
@@ -1140,6 +1216,64 @@
expect(entities).toEqual([entity1, entity2]);
});
});

it('should perform a delta set request', () => {
const entity1 = { _id: randomString() };
const entity2 = { _id: randomString() };
const store = new CacheStore(collection, null, { useDeltaSet: true });
const lastRequestDate = new Date();

nock(store.client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}`)
.reply(200, [entity1, entity2], {
'X-Kinvey-Request-Start': lastRequestDate.toISOString()
});

return store.pull()
.then(() => {
const changedEntity2 = Object.assign({}, entity2, { title: 'test' });
nock(client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}/_deltaset`)
.query({ since: lastRequestDate.toISOString() })
.reply(200, { changed: [changedEntity2], deleted: [{ _id: entity1._id }] }, {
'X-Kinvey-Request-Start': new Date().toISOString()
});

return store.pull()
.then((count) => {
expect(count).toEqual(1);
});
});
});

it('should perform a delta set request with a tagged datastore', () => {
const entity1 = { _id: randomString() };
const entity2 = { _id: randomString() };
const store = new CacheStore(collection, null, { useDeltaSet: true, tag: randomString() });
const lastRequestDate = new Date();

nock(store.client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}`)
.reply(200, [entity1, entity2], {
'X-Kinvey-Request-Start': lastRequestDate.toISOString()
});

return store.pull()
.then(() => {
const changedEntity2 = Object.assign({}, entity2, { title: 'test' });
nock(client.apiHostname)
.get(`/appdata/${store.client.appKey}/${collection}/_deltaset`)
.query({ since: lastRequestDate.toISOString() })
.reply(200, { changed: [changedEntity2], deleted: [{ _id: entity1._id }] }, {
'X-Kinvey-Request-Start': new Date().toISOString()
});

return store.pull()
.then((count) => {
expect(count).toEqual(1);
});
});
});
});

describe('sync()', () => {
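The specs above also pin down the wire exchange. A sketch of the shapes they mock (the paths and the since parameter come from the nock setup; the ids, timestamps, and the appKey placeholder are illustrative):

// Initial fetch: a regular GET, stamped by the server with X-Kinvey-Request-Start.
const initialExchange = {
  request: { method: 'GET', path: '/appdata/<appKey>/books' },
  response: {
    status: 200,
    body: [{ _id: '1' }, { _id: '2' }],
    headers: { 'X-Kinvey-Request-Start': '2018-05-18T00:00:00.000Z' }
  }
};

// Later pull: a delta set GET using the stored timestamp; deleted ids are
// removed from the offline cache and changed entities are upserted.
const deltaSetExchange = {
  request: { method: 'GET', path: '/appdata/<appKey>/books/_deltaset?since=2018-05-18T00:00:00.000Z' },
  response: {
    status: 200,
    body: { changed: [{ _id: '2', title: 'test' }], deleted: [{ _id: '1' }] },
    headers: { 'X-Kinvey-Request-Start': '2018-05-18T00:05:00.000Z' }
  }
};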
159 changes: 159 additions & 0 deletions src/core/datastore/deltaset.js
@@ -0,0 +1,159 @@
import isArray from 'lodash/isArray';
import { format } from 'url';
import { Query } from '../query';
import { KinveyRequest, RequestMethod, AuthType } from '../request';
import { Client } from '../client';
import { KinveyError } from '../errors';
import { repositoryProvider } from './repositories';
import { buildCollectionUrl } from './repositories/utils';
Contributor:

The reason utils are not reexported from /repositories/index.js is that they were local, used only there. Once something is being used elsewhere, it should move to another utils folder, some levels up, at the lowest level, where it can be shared accordingly.

import {
stripTagFromCollectionName,
generateEntityId,
queryCacheCollectionName,
xKivneyRequestStartHeader
} from './utils';

function getDeltaSetQuery(collectionNameWithTag, query) {
Contributor:

Don't need the "withTag" part of the variable name. We're trying to not think about that. The only place which knows about the tag is the buildCollectionUrl() function. Not even the entire class.

const serializedQuery = query ? query.toString() : '';

return repositoryProvider.getOfflineRepository()
.then((offlineRepo) => {
const queryCacheQuery = new Query()
.equalTo('collectionName', collectionNameWithTag)
.and()
.equalTo('query', serializedQuery);
return offlineRepo.read(queryCacheCollectionName, queryCacheQuery)
@gprodanov (Contributor), Mar 19, 2018:

I now remember we went a different direction with the kinvey_sync collection - it is being tagged like any other collection. Here with the "query cache" collection, we are only scoping the collection name "column" of its "rows". Should we may be unify our approach?

Contributor (author):

I don't understand what you mean. Can you explain more?

Contributor:

I mean that when you have a data store with no tag, it uses a collection kinvey_sync for the sync queue. If you use a data store with a tag, for the same collection, it uses a collection called kinvey_sync.<tag-name> for the sync queue.

The same two cases, regarding the _QueryCache collection, would use the same collection. Just the values for the collection "column" would be different. The same could have been applied to the kinvey_sync collection, but we went a different route. I'm saying we should decide which is better, and a unified behaviour might be a consideration in this decision.
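For illustration, the two scoping approaches under discussion look roughly like this (collection and tag names are placeholders):

// 1) The sync queue tags the collection itself: kinvey_sync for an untagged
//    store, kinvey_sync.<tag-name> for a tagged one (per the comment above).
// 2) _QueryCache keeps a single collection and scopes rows by collectionName:
const untaggedRow = { collectionName: 'books', query: '', lastRequest: null };
const taggedRow = { collectionName: 'books.<tag-name>', query: '', lastRequest: null };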

.then((deltaSetQueryDocs = []) => {
if (deltaSetQueryDocs.length > 0) {
return deltaSetQueryDocs[0];
}

return {
_id: generateEntityId(),
collectionName: collectionNameWithTag,
query: serializedQuery
};
});
});
}

function updateDeltaSetQuery(deltaSetQuery, response) {
Contributor:

This function doesn't seem to need the response. It only needs the header value.

return repositoryProvider.getOfflineRepository()
.then((offlineRepo) => {
deltaSetQuery.lastRequest = response.headers.get(xKivneyRequestStartHeader);
return offlineRepo.update(queryCacheCollectionName, deltaSetQuery);
});
}

function readDocs(collectionNameWithTag, query) {
Contributor:

I don't think we've used the term "docs" anywhere before - why start now?

Contributor:

"entities" is the term we generally use.

return repositoryProvider.getOfflineRepository()
.then((offlineRepo) => {
return offlineRepo.read(collectionNameWithTag, query);
});
}

function deleteDocs(collectionNameWithTag, deleted = []) {
Contributor:

This method signature is confusing. Is the purpose of the method to delete the entities that are passed in as an array parameter? If so, it's clearer to name that variable something like entitiesToDelete rather than deleted.

return repositoryProvider.getOfflineRepository()
.then((offlineRepo) => {
if (isArray(deleted) && deleted.length > 0) {
const deletedIds = deleted.map((doc) => doc._id);
const deleteQuery = new Query().contains('_id', deletedIds);
return offlineRepo.delete(collectionNameWithTag, deleteQuery)
}

return deleted;
Contributor:

offlineRepo.delete() returns the number of deleted entities. deleted can be null/undefined, it seems. Perhaps the method should return 0, to keep return types consistent.

});
}
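A hypothetical variant folding in the review suggestions above (entity naming and a consistent numeric return); sketch only, not part of this diff:

function deleteEntities(collectionNameWithTag, entitiesToDelete = []) {
  return repositoryProvider.getOfflineRepository()
    .then((offlineRepo) => {
      if (isArray(entitiesToDelete) && entitiesToDelete.length > 0) {
        const deletedIds = entitiesToDelete.map((entity) => entity._id);
        const deleteQuery = new Query().contains('_id', deletedIds);
        return offlineRepo.delete(collectionNameWithTag, deleteQuery);
      }
      return 0; // keep the return type consistent with offlineRepo.delete()
    });
}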

function updateDocs(collectionNameWithTag, changed = []) {
Contributor:

Same as comments for deleteDocs.

return repositoryProvider.getOfflineRepository()
.then((offlineRepo) => {
if (isArray(changed) && changed.length > 0) {
return offlineRepo.update(collectionNameWithTag, changed)
}

return changed;
});
}

function makeDeltaSetRequest(collectionNameWithTag, deltaSetQuery, query, options) {
const collectionName = stripTagFromCollectionName(collectionNameWithTag);
Contributor:

Like we said, only buildCollectionUrl() knows about tags.

const client = Client.sharedInstance();
const request = new KinveyRequest({
Contributor:

Why not use KinveyRequest.execute()? 17 lines for one http request seems excessive. Most of these options and attributes being listed in the config object are never used, not documented, or not cohesively supported.

Contributor:

@georgiwe I don't completely understand your comment. What KinveyRequest.execute() call are you referring to? Is there an example?

I agree in principle with your point, though. It seems like a bunch of these options (skipBL, timeout etc) would apply to all data requests, so some way to avoid listing them out one-by-one will be useful.

@gprodanov (Contributor), Mar 20, 2018:

I mean like here. It basically takes care of all these defaults which are used over 90% of the time, extracting the data, etc. Just getting rid of boilerplate.

But perhaps you meant something else. I was saying ignore things like skipBL, since they are not documented (reference). We can expand on the utilities around making a request, but about a quarter of the old data store classes were code which made requests, many of these options either not documented/supported or rarely used, and got copied around the codebase.

authType: AuthType.Default,
method: RequestMethod.GET,
url: format({
protocol: client.apiProtocol,
host: client.apiHost,
pathname: buildCollectionUrl(collectionName, '_deltaset'),
Contributor:

'_deltaset' should be the third argument - please check the signature of buildCollectionUrl().

query: { since: deltaSetQuery.lastRequest }
}),
query,
timeout: options.timeout,
followRedirect: options.followRedirect,
cache: options.cache,
properties: options.properties,
skipBL: options.skipBL,
trace: options.trace,
client
});

return request.execute()
.then((response) => {
return updateDeltaSetQuery(deltaSetQuery, response).then(() => response.data);
})
.then((data) => {
return deleteDocs(collectionNameWithTag, data.deleted)
.then(() => updateDocs(collectionNameWithTag, data.changed))
.then(() => data);
});
}

function makeRequest(collectionNameWithTag, deltaSetQuery, query, options) {
Contributor:

This function can very easily be split into two - it does two (and a half) things - performs a delta set request, or a regular get request (and the "half" is managing how local store should be updated). It's clearly visible where one, self-sufficient part ends (the first return). This should be a separate function.

The name "makeRequest" even suggests things should be different - if you can't think of a descriptive name for a function, it means it's not clear what it does. Which means it should change.

I think the way to solve this is split makeRequest into all of its small operations and then "merge" makeRequest with deltaSet. The resulting function would delegate all operations to other functions and handle common context. This is readable and each separate function would be cohesive.

After "a request" is made, both "paths" (which should be functions) call updateDeltaSetQuery(). This would be solved by the above paragraph too.

if (deltaSetQuery && deltaSetQuery.lastRequest) {
if (query && (query.skip != null || query.limit != null)) {
Contributor:

!query.skip seems better than query.skip != null - it's easy to miss that this isn't !== and this is the comparison operator used in js, most commonly. This hinders readability for no apparent reason.

Like the default value for query.skip is 0 and this is why this works, but it wouldn't work with !==, so this is error prone. I mean I agree the "weak" comparison is more readable than checking multiple values, but done with the ! is better than the != operator.

Contributor:

Another point is that this check would return the error even when doing a CacheStore.find(). I asked this question in the design doc too, but shouldn't this be only for pull()? "skip" and "limit" should be supported when doing a find().

Contributor:

@georgiwe re: your second point - isn't the idea that limit and skip together with a deltaset request is not supported? If I'm reading this code snippet correctly, making a deltaSetQuery with limit and skip is not supported. I think that's reasonable. Thoughts?

Contributor:

Sure, and I didn't think of this at the time, but yesterday I kinda changed my mind. And this may affect the way we treat skip and limit, if we want to keep some consistency.

It seems important to keep .find()s fully functional. In all store types, with all settings. If not, user would need to write their app in a less convenient way and keep track of what works when. If we start throwing an error in CacheStore.find() when skip and limit is passed, and deltaset is enabled, as a developer, it would be difficult to write my app. I may want to make some skip and limit requests - should I disable delta set for those requests only?

One way to keep them, is not send them as part of the delta set request (ignore them), but apply them to the request to the offline store, here. This way we "update the local store", and then read whatever the user is trying to read, from a "synced" place.

This approach may introduce some confusion as it's different from pull(), but I think it isn't too bad.

Does this make sense, or am I missing something?

Contributor (author):

This makes sense. I think your suggestion also makes sense. @tejasranade thoughts?

@tejasranade (Contributor), Mar 26, 2018:

Are you suggesting that limit and skip not be applied on the delta request to the backend, but get applied on the subsequent cache request? Doesn't that conflict with the typical purpose of limit and skip (which I imagine is to limit the amount of data being retrieved from the backend)?

How about treating deltaSet and limit/skip as mutually exclusive? If limit / skip is specified, we'll just use it in a standard GET request - regardless of whether the store has deltaset enabled. The assumption is that by specifying limit and skip, the developer has decided what slice of data they're looking for.

Contributor:

Yes, this is what I'm suggesting. I don't think it conflicts. I think skip and limit help you target a specific subset of data (just like you said), usually when paging things, but not always. Not just from the server, but from any data source. Cause if you had to pull everything, just to take the last 3 entities, things don't make sense, regardless of the data source.

I guess turning off deltaset would make the API easier to work with, but I think what I suggested is better - we get the best of both worlds. We get performance, we get the expected result and an easy to use API. What do you dislike about it - you feel it's semantically wrong?

Ultimately, I'm fine with whatever you decide - the most important part is to have an easy to use API, which works correctly and as expected.

throw new KinveyError('You cannot use the skip and limit modifiers on the query when performing a delta set request.');
Contributor:

makeRequest (it isn't very clear what this request is) returns a promise - it should not be throwing exceptions too. The way to return an error here is return Promise.reject(err). Keeping return types consistent is important.

Contributor:

Aren't they equivalent, in how we're using them? I wasn't quite sure of the difference, so I looked up this post - https://stackoverflow.com/questions/33445415/javascript-promises-reject-vs-throw

Now I think our use of throw (err) seems ok. WDYT?

Contributor:

If an exception is thrown somewhere in a promise chain, it's essentially the same. Like so:

function someFunc() {
  throw new Error('aaa');
}

Promise.resolve()
  .then(() => {
    someFunc();
  })
  .catch(err => {
    console.log('caught in promise.catch()', err.message); // the thrown error
  });

But, if this function is called outside of a promise chain like so:

try {
  someFunc() // throws a synchronous exception
    .catch(err => {
      console.log('err', err.message); // doesn't fire
    });
} catch (err) {
  console.log('caught in catch', err.message);
}

Currently, this particular function is only called in a promise, but it's not robust - if moved outside, things might break unexpectedly.

Another side of the reason is that a promise represents "a value in time". And the reason thrown exceptions are "caught" by promises is precisely this - they shouldn't be synchronous.

Contributor:

Oh! Totally forgot about the link you sent, sorry. Yeah, that use case is even trickier - throwing on the next processor tick. It's just a matter of not expecting a synchronous exception, when working with promises. Reading the function doesn't necessarily tell you "I should only use this in promise chains". We should do what essentially the example in the answer you linked does - wrap the exception in a promise.
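A minimal sketch of that wrap-it-in-a-promise suggestion (illustrative, not part of this diff):

if (query && (query.skip != null || query.limit != null)) {
  // Reject instead of throwing synchronously, so callers outside a promise
  // chain see the same behaviour as callers inside one.
  return Promise.reject(new KinveyError('You cannot use the skip and limit modifiers on the query when performing a delta set request.'));
}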

}

return makeDeltaSetRequest(collectionNameWithTag, deltaSetQuery, query, options);
Contributor:

@georgiwe made a point about the makeRequest naming earlier in the review, and I find it relevant here. If the same class has makeRequest and makeDeltaSetRequest functions, it's not clear what each is doing.

}

const collectionName = stripTagFromCollectionName(collectionNameWithTag);
const client = Client.sharedInstance();
const request = new KinveyRequest({
authType: AuthType.Default,
method: RequestMethod.GET,
url: format({
protocol: client.apiProtocol,
host: client.apiHost,
pathname: buildCollectionUrl(collectionName)
}),
query,
timeout: options.timeout,
followRedirect: options.followRedirect,
cache: options.cache,
properties: options.properties,
skipBL: options.skipBL,
trace: options.trace,
client
});
return request.execute()
.then((response) => {
return updateDeltaSetQuery(deltaSetQuery, response).then(() => response.data);
})
.then((data) => {
return updateDocs(collectionNameWithTag, data);
});
}
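Per the skip/limit thread above, the later commits in this PR drop the throw and instead fall back to a regular GET when the query pages (see the April 5 commits in the list). A rough, hypothetical sketch of that direction; makeRegularGetRequest is a made-up name for the plain GET path in the bottom half of makeRequest():

function makeRequest(collectionNameWithTag, deltaSetQuery, query, options) {
  const usesPaging = query && (query.skip != null || query.limit != null);

  if (deltaSetQuery && deltaSetQuery.lastRequest && !usesPaging) {
    // Delta set is only attempted for unpaged queries with a recorded request time.
    return makeDeltaSetRequest(collectionNameWithTag, deltaSetQuery, query, options);
  }

  // First pulls and paged queries go through a regular GET request.
  return makeRegularGetRequest(collectionNameWithTag, deltaSetQuery, query, options);
}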

export function deltaSet(collectionNameWithTag, query, options) {
return getDeltaSetQuery(collectionNameWithTag, query)
.then((deltaSetQuery) => makeRequest(collectionNameWithTag, deltaSetQuery, query, options))
Contributor:

There was talk about renaming this "query cache" table to "query log" - is this rejected, @tejasranade ? This would make the name of the variable deltaSetQuery (and related functions) something like queryLogEntry, which is more descriptive in my opinion. Calling it "query" right now is incorrect - it isn't a query, a serialized query is part of it. I know you'll say this is all variable names, but it isn't - it's terms which will be used and they should be well defined.

Contributor:

The design refers to the additional table as _QueryCache. Since there was some concern about whether it was a "cache", I floated the idea of _QueryLog instead. TBH, I suggest we go with _QueryCache because that's what we're using in other platforms.

In that context, we could use the names - getCachedQuery and cachedQuery here.

Contributor:

Alright, _QueryCache it is. Then all that's left is renaming deltaSetQuery to something like "queryCacheEntry", cause cachedQuery sounds like what would be in queryCacheEntry.query. So just one more term to "coin" - what is the entry in the query cache table :)

Contributor:

Okay, _QueryCache it is. And I think getCachedQuery and cachedQuery work in this context.

Just one more term to define - what is one entry in that table called? Cause it shouldn't be a "query" (and that's kinda why I didn't want the table to be a cache for queries).

.then(() => readDocs(collectionNameWithTag, query))
}

export function clearDeltaSet(collectionNameWithTag) {
const queryCacheQuery = new Query().equalTo('collectionName', collectionNameWithTag);
return repositoryProvider.getOfflineRepository()
.then((offlineRepo) => offlineRepo.delete(queryCacheCollectionName, queryCacheQuery));
}
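For reference, a query cache entry as built by getDeltaSetQuery() and stamped by updateDeltaSetQuery() above has roughly this shape (values are illustrative):

const queryCacheEntry = {
  _id: 'generated-entity-id',              // generateEntityId()
  collectionName: 'books.<tag-name>',      // collection name, including any tag
  query: '{}',                             // serialized Query, '' when none was given
  lastRequest: '2018-05-18T00:00:00.000Z'  // X-Kinvey-Request-Start of the last fetch
};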