This repository has been archived by the owner on Feb 12, 2024. It is now read-only.

feat: pin API #1045

Merged Jun 19, 2018
Changes from all commits (21 commits):
0662cd8 feat: jsipfs pin improvements (#1249) (JonKrone, Mar 16, 2018)
ee813eb revert: default assets are not added when running on a browser (JonKrone, Mar 16, 2018)
f847f98 feat(test): add tests for failure cases of normalizeHashes (JonKrone, Mar 16, 2018)
be8747e fix: add some changes missed during rebase, syntax fixes, etc (JonKrone, Mar 19, 2018)
beead9a test: initial work testing the core/pin.js implementation (JonKrone, Mar 20, 2018)
868570a refactor: move pin http-api tests to http-api/inject (JonKrone, Mar 20, 2018)
066c113 test: first draft of pin-set tests (JonKrone, Mar 23, 2018)
90dad57 feat: simplify root dagnode generation for storeItems base case (JonKrone, Mar 30, 2018)
17b81ea feat: rename vars, fix _depth default value, add docs (JonKrone, Apr 1, 2018)
753f618 feat: parallelize pin.isPinnedWithType (JonKrone, Apr 3, 2018)
92b3d2f refactor: refactor pinset.storeItems (JonKrone, Apr 3, 2018)
a5c556c fix: re-add pin interface tests (JonKrone, Apr 3, 2018)
8f7de05 fix: lint (JonKrone, Apr 3, 2018)
408d2bc feat: docs, rename resolvePaths, pin.getIndirectKeys now uses eachLimit (JonKrone, Apr 5, 2018)
f77a379 chore: rebase a month of changes, resolve minor issues from that (JonKrone, May 14, 2018)
e3013ee chore: update big.js version (JonKrone, May 25, 2018)
6c29a3d revert: do not pin content added with a non-default hash algorithm (JonKrone, May 29, 2018)
a1b9b93 revert: internalKey recording (JonKrone, May 31, 2018)
10f40ca refactor: use lodash.flattenDeep (JonKrone, Jun 8, 2018)
a7a55b9 refactor: do not expose pinTypes (JonKrone, Jun 14, 2018)
d0b4a0c fix: do not destructure node callback results (JonKrone, Jun 18, 2018)
7 changes: 5 additions & 2 deletions package.json
@@ -88,7 +88,7 @@
},
"dependencies": {
"async": "^2.6.0",
"big.js": "^5.0.3",
"big.js": "^5.1.2",
"binary-querystring": "~0.1.2",
"bl": "^1.2.2",
"boom": "^7.2.0",
@@ -98,13 +98,15 @@
"debug": "^3.1.0",
"file-type": "^7.7.1",
"filesize": "^3.6.1",
"fnv1a": "^1.0.1",
"fsm-event": "^2.1.0",
"get-folder-size": "^1.0.1",
"glob": "^7.1.2",
"hapi": "^16.6.2",
"hapi-set-header": "^1.0.2",
"hoek": "^5.0.3",
"human-to-milliseconds": "^1.0.0",
"interface-datastore": "^0.4.1",
Member Author: @JonKrone Is this dependency necessary? If yes, for what?

Contributor: It is used to generate a Key where we put pin records in the datastore: https://github.com/ipfs/js-ipfs/pull/1045/files#diff-d96275251fef125ec785643a9facfab4R27.

You had previously brought this up in this PR or #107, though I can't seem to find it now. You had suggested that the datastore put, get, .. operations cast strings to Keys automatically. @dignifiedquire What do you think, is that workable?

Member Author: It should be. Wanna go ahead and make that happen?

"ipfs-api": "^22.0.0",
"ipfs-bitswap": "~0.20.0",
"ipfs-block": "~0.7.1",
@@ -137,6 +139,7 @@
"libp2p-websocket-star": "~0.8.0",
"libp2p-websockets": "~0.12.0",
"lodash.flatmap": "^4.5.0",
"lodash.flattendeep": "^4.4.0",
"lodash.get": "^4.4.2",
"lodash.set": "^4.3.2",
"lodash.sortby": "^4.7.0",
@@ -218,7 +221,7 @@
"Jade Meskill <jade.meskill@gmail.com>",
"Johannes Wikner <johannes.wikner@gmail.com>",
"Jon Schlinkert <dev@sellside.com>",
"Jonathan <jkrone@vt.edu>",
"Jonathan Krone <jonathankrone@gmail.com>",
"João Antunes <j.goncalo.antunes@gmail.com>",
"João Santos <joaosantos15@users.noreply.github.com>",
"Kevin Wang <kevin@fossa.io>",
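Following up on the Key-casting suggestion in the thread above, here is a minimal sketch of what a string-accepting datastore wrapper might look like. It is only an illustration of that idea, not code from this PR; it assumes interface-datastore's callback-style API and its exported Key class, and the withStringKeys helper name (and the example key path) are made up.

// Hypothetical helper, not part of this PR: lets callers pass plain strings
// to the datastore by casting them to Keys internally.
const Key = require('interface-datastore').Key

function withStringKeys (datastore) {
  const toKey = (k) => typeof k === 'string' ? new Key(k) : k
  return {
    put: (key, value, callback) => datastore.put(toKey(key), value, callback),
    get: (key, callback) => datastore.get(toKey(key), callback),
    has: (key, callback) => datastore.has(toKey(key), callback),
    delete: (key, callback) => datastore.delete(toKey(key), callback)
  }
}

// Usage sketch: pin records could then be written with a bare string key.
// withStringKeys(repo.datastore).put('/example/pins', serialized, cb)  // illustrative key name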
8 changes: 7 additions & 1 deletion src/cli/commands/files/add.js
@@ -173,6 +173,11 @@ module.exports = {
type: 'boolean',
default: false,
describe: 'Write no output'
},
pin: {
type: 'boolean',
default: true,
describe: 'Pin this object when adding'
}
},

@@ -188,7 +193,8 @@
rawLeaves: argv.rawLeaves,
onlyHash: argv.onlyHash,
hashAlg: argv.hash,
wrapWithDirectory: argv.wrapWithDirectory
wrapWithDirectory: argv.wrapWithDirectory,
pin: argv.pin
}

// Temporary restriction on raw-leaves:
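The new pin option also flows through to the programmatic API (the CLI above simply forwards argv.pin into the add options). A small usage sketch, assuming an already-initialized js-ipfs node named ipfs; pinning defaults to true, so passing pin: false opts out.

// Sketch: add content without pinning it (pin defaults to true).
ipfs.files.add(Buffer.from('hello world'), { pin: false }, (err, files) => {
  if (err) { throw err }
  files.forEach((file) => console.log(file.path, file.hash))
})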
15 changes: 15 additions & 0 deletions src/cli/commands/pin.js
@@ -0,0 +1,15 @@
'use strict'

module.exports = {
command: 'pin',

description: 'Pin and unpin objects to local storage.',

builder (yargs) {
return yargs
.commandDir('pin')
},

handler (argv) {
}
}
29 changes: 29 additions & 0 deletions src/cli/commands/pin/add.js
@@ -0,0 +1,29 @@
'use strict'

const print = require('../../utils').print

module.exports = {
command: 'add <ipfsPath...>',

describe: 'Pins object to local storage.',

builder: {
recursive: {
type: 'boolean',
alias: 'r',
default: true,
describe: 'Recursively pin the object linked to by the specified object(s).'
}
},

handler (argv) {
const recursive = argv.recursive
const type = recursive ? 'recursive' : 'direct'
argv.ipfs.pin.add(argv.ipfsPath, { recursive: recursive }, (err, results) => {
if (err) { throw err }
results.forEach((res) => {
print(`pinned ${res.hash} ${type}ly`)
})
})
}
}
43 changes: 43 additions & 0 deletions src/cli/commands/pin/ls.js
@@ -0,0 +1,43 @@
'use strict'

const print = require('../../utils').print

module.exports = {
// bracket syntax with '...' tells yargs to optionally accept a list
command: 'ls [ipfsPath...]',

describe: 'List objects pinned to local storage.',

builder: {
type: {
type: 'string',
alias: 't',
default: 'all',
choices: ['direct', 'indirect', 'recursive', 'all'],
describe: 'The type of pinned keys to list.'
},
quiet: {
type: 'boolean',
alias: 'q',
default: false,
describe: 'Write just hashes of objects.'
}
},

handler: (argv) => {
const paths = argv.ipfsPath
const type = argv.type
const quiet = argv.quiet

argv.ipfs.pin.ls(paths, { type }, (err, results) => {
if (err) { throw err }
results.forEach((res) => {
let line = res.hash
if (!quiet) {
line += ` ${res.type}`
}
print(line)
})
})
}
}
28 changes: 28 additions & 0 deletions src/cli/commands/pin/rm.js
@@ -0,0 +1,28 @@
'use strict'

const print = require('../../utils').print

module.exports = {
command: 'rm <ipfsPath...>',

describe: 'Removes the pinned object from local storage.',

builder: {
recursive: {
type: 'boolean',
alias: 'r',
default: true,
describe: 'Recursively unpin the objects linked to by the specified object(s).'
}
},

handler: (argv) => {
const recursive = argv.recursive
argv.ipfs.pin.rm(argv.ipfsPath, { recursive: recursive }, (err, results) => {
if (err) { throw err }
results.forEach((res) => {
print(`unpinned ${res.hash}`)
})
})
}
}
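For reference, the three CLI handlers above delegate to the core pin API with the shapes shown in the handlers themselves (a path or list of paths, an options object, and a callback). A rough programmatic equivalent is sketched below, assuming an initialized js-ipfs node named ipfs; the hash value is purely illustrative.

// Sketch mirroring the CLI handlers above; the hash below is a placeholder value.
const hash = 'QmSomeExampleHash'  // hypothetical CID string

ipfs.pin.add(hash, { recursive: true }, (err, results) => {
  if (err) { throw err }
  results.forEach((res) => console.log(`pinned ${res.hash} recursively`))

  ipfs.pin.ls(hash, { type: 'all' }, (err, pins) => {
    if (err) { throw err }
    pins.forEach((pin) => console.log(pin.hash, pin.type))

    ipfs.pin.rm(hash, { recursive: true }, (err, removed) => {
      if (err) { throw err }
      removed.forEach((res) => console.log(`unpinned ${res.hash}`))
    })
  })
})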
1 change: 1 addition & 0 deletions src/core/boot.js
@@ -29,6 +29,7 @@ module.exports = (self) => {

series([
(cb) => self._repo.open(cb),
(cb) => self.pin._load(cb),
(cb) => self.preStart(cb),
(cb) => {
self.log('initialized')
25 changes: 25 additions & 0 deletions src/core/components/dag.js
@@ -3,6 +3,8 @@
const promisify = require('promisify-es6')
const CID = require('cids')
const pull = require('pull-stream')
const mapAsync = require('async/map')
const flattenDeep = require('lodash.flattendeep')

module.exports = function dag (self) {
return {
@@ -33,6 +35,12 @@ module.exports = function dag (self) {
} else {
path = '/'
}
} else if (Buffer.isBuffer(cid)) {
try {
cid = new CID(cid)
} catch (err) {
return callback(err)
}
}

self._ipld.get(cid, path, options, callback)
@@ -73,6 +81,23 @@ module.exports = function dag (self) {
self._ipld.treeStream(cid, path, options),
pull.collect(callback)
)
}),

// TODO - use IPLD selectors once they are implemented
_getRecursive: promisify((multihash, callback) => {
Member: I think this is fine as a first pass, and it'll be great to replace with IPLD selectors when they come along. I'm just wondering: there's no concept of "seen" nodes here, meaning we could get duplicates in the return value. Would that cause issues, do we need to dedupe?

Contributor: Hmm. Good question, I'm not sure.

My first thought is that yes, we could hit a loop, but when I look at ipld's treeStream implementation (https://github.com/ipld/js-ipld/blob/master/src/index.js#L322), I don't see any check that we're not traversing into a node we've seen before, which I would expect ipld to watch for if nodes could link to an ancestor. I can also imagine that many problems would arise in serialization, hashing, etc. if that were possible. Can nodes link to an ancestor?

I think the answer here also affects the seen check in pinset.hasDescendant: https://github.com/ipfs/js-ipfs/pull/1045/files#diff-cd1d776f0354636e2cbc1d609af7fddfR62

Member:

Can nodes link to an ancestor?

No, because a DAG is acyclic.

// gets flat array of all DAGNodes in tree given by multihash

self.dag.get(new CID(multihash), (err, res) => {
if (err) { return callback(err) }

mapAsync(res.value.links, (link, cb) => {
self.dag._getRecursive(link.multihash, cb)
}, (err, nodes) => {
// console.log('nodes:', nodes)
if (err) return callback(err)
callback(null, flattenDeep([res.value, nodes]))
})
})
})
}
}
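On the duplicates question above: a DAG being acyclic means the traversal terminates, but the same node can still be reachable through several links, so the flattened result may contain repeats. Below is a sketch (not what this PR merged) of how a visited set could be threaded through the same dag.get / flattenDeep traversal; the getRecursiveUnique name is made up.

// Sketch only: same traversal as _getRecursive above, but skipping nodes
// whose multihash has already been visited.
const CID = require('cids')
const mapAsync = require('async/map')
const flattenDeep = require('lodash.flattendeep')

function getRecursiveUnique (dag, multihash, callback, seen = new Set()) {
  const cid = new CID(multihash)
  const key = cid.toBaseEncodedString()
  if (seen.has(key)) { return callback(null, []) }  // this subtree was already collected
  seen.add(key)

  dag.get(cid, (err, res) => {
    if (err) { return callback(err) }
    mapAsync(res.value.links, (link, cb) => {
      getRecursiveUnique(dag, link.multihash, cb, seen)
    }, (err, nodes) => {
      if (err) { return callback(err) }
      callback(null, flattenDeep([res.value, nodes]))
    })
  })
}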
23 changes: 20 additions & 3 deletions src/core/components/files.js
@@ -32,7 +32,9 @@ function prepareFile (self, opts, file, callback) {
}

waterfall([
(cb) => opts.onlyHash ? cb(null, file) : self.object.get(file.multihash, opts, cb),
(cb) => opts.onlyHash
? cb(null, file)
: self.object.get(file.multihash, opts, cb),
Member: Just a note for the future: please avoid formatting changes to code you're not altering. It makes review harder and more time-consuming, since my attention is drawn away from the actual code changes, additions, and removals.

Contributor: Agreed, this was a rebase conflict that I resolved to the previous formatting.

(node, cb) => {
const b58Hash = cid.toBaseEncodedString()

@@ -87,6 +89,19 @@ function normalizeContent (opts, content) {
})
}

function pinFile (self, opts, file, cb) {
// Pin a file if it is the root dir of a recursive add or the single file
// of a direct add.
const pin = 'pin' in opts ? opts.pin : true
const isRootDir = !file.path.includes('/')
Member: I don't know the code too well. Is the path already normalized here? Does it work on Windows?

Contributor: I believe so; this path comes from ipfs-unixfs-engine's exporter, which passes paths with a unix separator: https://github.com/ipfs/js-ipfs-unixfs-engine/blob/master/test/exporter.js#L302

const shouldPin = pin && isRootDir && !opts.onlyHash && !opts.hashAlg
Member: We can't pin if hashAlg is specified?

Contributor: Yeah, this is related to #1373 (comment). hashAlg creates an incorrect DAG and requires changes to IPLD before we can support it.

Member: If the hashAlg is the default value, would it be OK? I think we need to call back with an error if hashAlg is passed, instead of silently not pinning.

Contributor (@JonKrone, Jun 19, 2018): No, I don't think it'd work. IIRC, it's more like 'IPLD can't handle CIDv1', and anytime --hash is provided there is a cast to v1 here (as it should be, and as is documented in the command's help text).

Example: a node is created and fetched with a v1 key but the blob's internal Hash is v0: [screenshot omitted]

I am happy to cb(err) but should we just revert the functionality?

if (shouldPin) {
return self.pin.add(file.hash, err => cb(err, file))
} else {
cb(null, file)
}
}

class AddHelper extends Duplex {
constructor (pullStream, push, options) {
super(Object.assign({ objectMode: true }, options))
@@ -130,7 +145,8 @@ module.exports = function files (self) {
}

let total = 0
let prog = opts.progress || (() => {})

const prog = opts.progress || noop
const progress = (bytes) => {
total += bytes
prog(total)
@@ -141,7 +157,8 @@
pull.map(normalizeContent.bind(null, opts)),
pull.flatten(),
importer(self._ipld, opts),
pull.asyncMap(prepareFile.bind(null, self, opts))
pull.asyncMap(prepareFile.bind(null, self, opts)),
pull.asyncMap(pinFile.bind(null, self, opts))
)
}
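Picking up the open question above about erroring instead of silently skipping the pin: one hypothetical variant of pinFile with that behaviour is sketched below. The merged code above simply leaves content unpinned when hashAlg or onlyHash is set; pinFileStrict is not part of this PR.

// Hypothetical alternative to pinFile, not part of this PR: call back with an
// error when a non-default hash algorithm is requested instead of skipping the pin.
function pinFileStrict (self, opts, file, cb) {
  const pin = 'pin' in opts ? opts.pin : true
  const isRootDir = !file.path.includes('/')  // e.g. 'readme' is a root, 'dir/readme' is not

  if (pin && isRootDir && opts.hashAlg) {
    return cb(new Error('cannot pin content added with a non-default hash algorithm'))
  }
  if (pin && isRootDir && !opts.onlyHash) {
    return self.pin.add(file.hash, (err) => cb(err, file))
  }
  cb(null, file)
}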

1 change: 1 addition & 0 deletions src/core/components/index.js
@@ -18,6 +18,7 @@ exports.swarm = require('./swarm')
exports.ping = require('./ping')
exports.pingPullStream = require('./ping-pull-stream')
exports.pingReadableStream = require('./ping-readable-stream')
exports.pin = require('./pin')
exports.files = require('./files')
exports.bitswap = require('./bitswap')
exports.pubsub = require('./pubsub')
23 changes: 9 additions & 14 deletions src/core/components/init-assets.js
@@ -1,9 +1,7 @@
'use strict'

const path = require('path')
const fs = require('fs')
const glob = require('glob')
const importer = require('ipfs-unixfs-engine').importer
const pull = require('pull-stream')
const file = require('pull-file')
const CID = require('cids')
@@ -15,23 +13,20 @@ module.exports = function addDefaultAssets (self, log, callback) {

pull(
pull.values([initDocsPath]),
pull.asyncMap((val, cb) => glob(path.join(val, '/**/*'), cb)),
pull.asyncMap((val, cb) =>
glob(path.join(val, '/**/*'), { nodir: true }, cb)
),
Member: nodir - 👌 nice

pull.flatten(),
pull.map((element) => {
pull.map(element => {
const addPath = element.substring(index + 1)

if (fs.statSync(element).isDirectory()) { return }

return { path: addPath, content: file(element) }
}),
// Filter out directories, which are undefined from above
pull.filter(Boolean),
importer(self._ipld),
pull.through((el) => {
if (el.path === 'init-docs') {
const cid = new CID(el.multihash)
self.files.addPullStream(),
pull.through(file => {
if (file.path === 'init-docs') {
const cid = new CID(file.hash)
log('to get started, enter:\n')
log(`\t jsipfs files cat /ipfs/${cid.toBaseEncodedString()}/readme\n`)
log(`\tjsipfs files cat /ipfs/${cid.toBaseEncodedString()}/readme\n`)
}
}),
pull.collect((err) => {
4 changes: 3 additions & 1 deletion src/core/components/init.js
@@ -85,16 +85,18 @@ module.exports = function init (self) {
return cb(null, true)
}

self.log('adding assets')
const tasks = [
// add empty unixfs dir object (go-ipfs assumes this exists)
(cb) => self.object.new('unixfs-dir', cb)
]

if (typeof addDefaultAssets === 'function') {
// addDefaultAssets is undefined on browsers.
// See package.json browser config
Member Author: Yeah, we don't add the default assets (initial docs) in the browser to reduce the js payload.

tasks.push((cb) => addDefaultAssets(self, opts.log, cb))
}

self.log('adding assets')
parallel(tasks, (err) => {
if (err) {
cb(err)