Commit
chore: rename master branch to main
lpatiny committed May 9, 2024
1 parent 78b369c commit 8bc387a
Showing 3 changed files with 12 additions and 16 deletions.
4 changes: 1 addition & 3 deletions .github/workflows/nodejs.yml
@@ -3,11 +3,9 @@ name: Node.js CI
on:
  push:
    branches:
-     - master
+     - main
  pull_request:

jobs:
  nodejs:
    uses: zakodium/workflows/.github/workflows/nodejs.yml@nodejs-v1
    with:
      node-version-matrix: '[14, 16, 18]'
4 changes: 2 additions & 2 deletions .github/workflows/release.yml
@@ -3,7 +3,7 @@ name: Release
on:
  push:
    branches:
-     - master
+     - main

jobs:
  release:
@@ -12,4 +12,4 @@ jobs:
      npm: true
    secrets:
      github-token: ${{ secrets.BOT_TOKEN }}
      npm-token: ${{ secrets.NPM_BOT_TOKEN }}
20 changes: 9 additions & 11 deletions README.md
@@ -10,20 +10,19 @@
This package allows you to store spectra in an object and perform many processing operations on them.

It should be able to deal with a large number of large files and will automatically reduce the data size in order to keep it reasonable. This reduction of data is based on normalization parameters, which include:
- from / to
- normalization filters
- final number of points
- exclusion zones

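To make the list above concrete, here is a hypothetical options object; the property names are illustrative assumptions, not the package's documented API:

```js
// Hypothetical normalization parameters (names are assumptions, not the real API).
const normalization = {
  from: 1000, // lower bound of the x-range to keep
  to: 3000, // upper bound of the x-range to keep
  filters: [{ name: 'normalize' }], // normalization filters applied in order
  numberOfPoints: 4096, // final number of points after resampling
  exclusions: [{ from: 1450, to: 1550 }], // zones ignored during processing
};

console.log(normalization.numberOfPoints); // prints 4096
```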
By default, the package tries to keep memory usage below 64 MB; if keeping the original data would exceed that amount, the original data is removed.
As a consequence, you will then no longer be able to change the normalization parameters.

In practice, this means that if you have 1024 spectra of 65536 points and you normalize the data to 4096 points, the normalized data can still be kept in memory.

- Memory for the original spectra: 1024 × 65536 × 8 bytes (doubles) = 512 MB: over 64 MB, so the original spectra will be removed
- Memory for the normalized data: 1024 × 4096 × 8 bytes = 32 MB: can be kept and processed further

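The arithmetic above can be sketched as a small helper (illustrative only, not part of the package):

```js
// Estimate memory, in MB, for storing spectra as 64-bit floats (8 bytes per point).
function estimateMemoryMB(numberOfSpectra, pointsPerSpectrum) {
  const bytes = numberOfSpectra * pointsPerSpectrum * 8;
  return bytes / (1024 * 1024);
}

console.log(estimateMemoryMB(1024, 65536)); // prints 512: originals exceed the 64 MB limit
console.log(estimateMemoryMB(1024, 4096)); // prints 32: normalized data fits
```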
## Installation

@@ -37,7 +36,6 @@

```js
import SpectraProcessor from 'spectra-processor';

let spectrum = SpectraProcessor.addFromJcamp(jcamp);
```


## [API Documentation](https://cheminfo.github.io/spectra-processor/)

## License
@@ -46,7 +44,7 @@ let spectrum = SpectraProcessor.addFromJcamp(jcamp);

[npm-image]: https://img.shields.io/npm/v/spectra-processor.svg?style=flat-square
[npm-url]: https://www.npmjs.com/package/spectra-processor
-[travis-image]: https://img.shields.io/travis/cheminfo/spectra-processor/master.svg?style=flat-square
+[travis-image]: https://img.shields.io/travis/cheminfo/spectra-processor/main.svg?style=flat-square
[travis-url]: https://travis-ci.org/cheminfo/spectra-processor
[codecov-image]: https://img.shields.io/codecov/c/github/cheminfo/spectra-processor.svg?style=flat-square
[codecov-url]: https://codecov.io/gh/cheminfo/spectra-processor
