Jellyfin integration #195

Merged · 60 commits · Jul 19, 2023

Commits
4e7dcac
perf(backend): use exposed db
IgnisDa Jul 17, 2023
f031779
docs: change position of important docs
IgnisDa Jul 17, 2023
64b0781
feat(backend): basic code for jellyfin integration
IgnisDa Jul 17, 2023
a82f7f1
build(backend): add nanoid deps
IgnisDa Jul 17, 2023
11b5215
feat(backend): generate nanoid application token
IgnisDa Jul 17, 2023
e42827b
feat(backend): add resolver to create jellyfin integration
IgnisDa Jul 17, 2023
0931a43
refactor(backend): use one method to delete all integrations
IgnisDa Jul 17, 2023
8414fec
chore(frontend): adapt to new gql schema
IgnisDa Jul 17, 2023
218051e
refactor(backend): return all integrations from one resolver
IgnisDa Jul 17, 2023
aeae77b
feat(frontend): display all integrations
IgnisDa Jul 17, 2023
9fc6f20
fix(backend): change text returned for integrations
IgnisDa Jul 17, 2023
f4cc0ea
fix(frontend): sink integration not being created
IgnisDa Jul 17, 2023
a57c5d8
docs: add jellyfin to readme
IgnisDa Jul 17, 2023
deed424
docs: add jellyfin integration guide
IgnisDa Jul 17, 2023
973231a
fix(docs): make guides more informative
IgnisDa Jul 17, 2023
10e8404
build(backend): bump version
IgnisDa Jul 17, 2023
d0db184
feat(backend): add endpoint for jellyfin webhook
IgnisDa Jul 17, 2023
30c4872
docs: add instructions for the new plugin
IgnisDa Jul 17, 2023
fceb2ca
feat(backend): jellyfin payload structure for movies
IgnisDa Jul 18, 2023
5541fba
docs: remove useless info
IgnisDa Jul 18, 2023
6e918ef
docs: add info about jellyfin shows
IgnisDa Jul 18, 2023
1886e8c
build(backend): add deps for hashing user id
IgnisDa Jul 18, 2023
f941c4c
feat(backend): use hashed id for user id
IgnisDa Jul 18, 2023
f020557
feat(backend): extract slug from webhook
IgnisDa Jul 18, 2023
3415cab
feat(backend): default value for user sink integrations
IgnisDa Jul 18, 2023
e3e7430
feat(backend): add handling for progress
IgnisDa Jul 18, 2023
58a4ea5
refactor(backend): use macro for decimals
IgnisDa Jul 18, 2023
4c7104d
docs: add info about shows and change webhook url
IgnisDa Jul 18, 2023
936b949
refactor(backend): move parsing logic to service
IgnisDa Jul 18, 2023
703b734
feat(backend): commit jellyfin movie update
IgnisDa Jul 18, 2023
f251cd9
feat(backend): send sink integration info about shows
IgnisDa Jul 18, 2023
3b50f27
fix(backend): decimal for audiobookshelf progress
IgnisDa Jul 18, 2023
0ba38fe
refactor(backend): account for single progress update
IgnisDa Jul 18, 2023
6ab03f3
refactor(backend): extract fn for progress update
IgnisDa Jul 18, 2023
e2ecb36
feat(backend): calculate exact media progress
IgnisDa Jul 18, 2023
9e0f3a9
fix(backend): start to handle finished media
IgnisDa Jul 19, 2023
990e641
feat(backend): add config parameters for duplicate progress input
IgnisDa Jul 19, 2023
48c3cbc
refactor(backend): change memory db type
IgnisDa Jul 19, 2023
ffc6a16
feat(backend): handle integrations with more than 100% progress
IgnisDa Jul 19, 2023
29bca95
build(backend): add sha deps
IgnisDa Jul 19, 2023
15aac6f
refactor(backend): change order of enums
IgnisDa Jul 19, 2023
a67152e
feat(backend): nest webhook routes
IgnisDa Jul 19, 2023
ceba3c4
refactor(backend): change order of fns
IgnisDa Jul 19, 2023
e76ebaf
try(backend): try hashing approach for progress update
IgnisDa Jul 19, 2023
9811825
Revert "build(backend): add sha deps"
IgnisDa Jul 19, 2023
9c3cf13
Revert "try(backend): try hashing approach for progress update"
IgnisDa Jul 19, 2023
8db5866
refactor(backend): remove auth db ref from gql schema
IgnisDa Jul 19, 2023
340ce66
style(backend): apply clippy lints
IgnisDa Jul 19, 2023
70981f2
build(backend): change TBR version
IgnisDa Jul 19, 2023
aa29f9a
refactor(backend): move struct to where it makes sense
IgnisDa Jul 19, 2023
0c7085d
build(backend): add retainer deps
IgnisDa Jul 19, 2023
73395dc
docs(backend): better explanation of config param
IgnisDa Jul 19, 2023
8bd0bfe
Revert "Revert "build(backend): add sha deps""
IgnisDa Jul 19, 2023
ef4c562
feat(backend): add cache to service
IgnisDa Jul 19, 2023
7f40a81
Revert "Revert "try(backend): try hashing approach for progress update""
IgnisDa Jul 19, 2023
72601b9
fix(backend): change type of config param
IgnisDa Jul 19, 2023
09ef4bb
feat(backend): handle duplicate progress update cases
IgnisDa Jul 19, 2023
7d46b2e
feat(kodi): adapt to new resolver improvements
IgnisDa Jul 19, 2023
86e537d
Revert "Revert "Revert "build(backend): add sha deps"""
IgnisDa Jul 19, 2023
9903957
build(kodi): bump version
IgnisDa Jul 19, 2023
43 changes: 42 additions & 1 deletion Cargo.lock

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions README.md
@@ -54,9 +54,9 @@ special tool on your computer or phone that lets you keep track of all these dig
## 🚀 Features

- ✅ [Supports](https://github.com/IgnisDa/ryot/discussions/4) tracking media
and fitness.
and fitness
- ✅ Import data from Goodreads, MediaTracker, Trakt, Movary, StoryGraph
- ✅ Integration with Kodi, Audiobookshelf
- ✅ Integration with Jellyfin, Kodi, Audiobookshelf
- ✅ Self-hosted
- ✅ PWA enabled
- ✅ Documented GraphQL API
5 changes: 4 additions & 1 deletion apps/backend/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "ryot"
version = "1.13.2"
version = "1.14.0-beta.1"
edition = "2021"
repository = "https://github.com/IgnisDa/ryot"
license = "GPL-V3"
@@ -29,14 +29,17 @@ dotenvy = "0.15.7"
enum_meta = "0.6.0"
futures = "0.3.28"
graphql_client = "0.13.0"
harsh = "0.2.2"
http = "0.2.9"
http-types = "2.12.0"
isolang = { version = "2.3.0", features = ["list_languages"] }
itertools = "0.10.5"
markdown = "1.0.0-alpha.10"
mime_guess = "2.0.4"
nanoid = "0.4.0"
quick-xml = { version = "0.28.2", features = ["serde", "serialize"] }
regex = "1.8.1"
retainer = "0.3.0"
rust-embed = "6.6.1"
rust_decimal = "1.29.1"
rust_decimal_macros = "1.29.1"
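The new `nanoid` dependency backs the "generate nanoid application token" commit above. A minimal sketch of how it could mint a per-user token for a sink (webhook) integration — the helper name and usage are assumptions, not the PR's actual code:

```rust
use nanoid::nanoid;

// Hypothetical helper: mints an opaque slug that a Jellyfin webhook URL can
// carry to identify the user's sink integration.
fn generate_sink_integration_slug() -> String {
    // nanoid!() returns a 21-character, URL-safe ID by default.
    nanoid!()
}

fn main() {
    println!("sink slug: {}", generate_sink_integration_slug());
}
```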
9 changes: 9 additions & 0 deletions apps/backend/src/config.rs
@@ -324,6 +324,9 @@ pub struct IntegrationConfig {
/// every `n` hours.
#[setting(default = 2)]
pub pull_every: i32,
/// The salt used to hash user IDs.
#[setting(default = format!("{}", PROJECT_NAME))]
pub hasher_salt: String,
}

impl IsFeatureEnabled for FileStorageConfig {
@@ -386,6 +389,11 @@ pub struct ServerConfig {
/// are running the server on `localhost`.
/// [More information](https://github.com/IgnisDa/ryot/issues/23)
pub insecure_cookie: bool,
/// The number of hours after which a media item can be marked as seen again
/// for a user. This prevents the same media from being marked as started when
/// it has already been marked as seen within the last `n` hours.
#[setting(default = 2)]
pub progress_update_threshold: u64,
}

#[derive(Debug, Serialize, Deserialize, Clone, Config)]
@@ -453,6 +461,7 @@ impl AppConfig {
cl.file_storage.s3_access_key_id = gt();
cl.file_storage.s3_secret_access_key = gt();
cl.file_storage.s3_url = gt();
cl.integration.hasher_salt = gt();
cl.movies.tmdb.access_token = gt();
cl.podcasts.listennotes.api_token = gt();
cl.shows.tmdb.access_token = gt();
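The `hasher_salt` setting pairs with the `harsh` dependency added in Cargo.toml. A rough sketch of how a numeric user id might become an opaque, reversible hash for webhook URLs, assuming harsh's builder API; the helper is illustrative, not the PR's actual code:

```rust
use harsh::Harsh;

// Illustrative helper: build a hashids-style hasher from the configured salt.
fn user_id_hasher(salt: &str) -> Harsh {
    Harsh::builder().salt(salt).build().expect("valid hashids salt")
}

fn main() {
    // The default salt is the project name per the config default above.
    let hasher = user_id_hasher("ryot");
    let hashed = hasher.encode(&[42]); // e.g. embedded in the webhook path
    let decoded = hasher.decode(&hashed).expect("round-trips");
    assert_eq!(decoded, vec![42]);
    println!("user 42 -> {hashed}");
}
```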
4 changes: 3 additions & 1 deletion apps/backend/src/entities/user.rs
@@ -11,7 +11,7 @@ use serde::{Deserialize, Serialize};

use crate::{
migrator::UserLot,
users::{UserPreferences, UserYankIntegrations},
users::{UserPreferences, UserSinkIntegrations, UserYankIntegrations},
};

fn get_hasher() -> Argon2<'static> {
@@ -33,6 +33,8 @@
pub preferences: UserPreferences,
#[graphql(skip)]
pub yank_integrations: Option<UserYankIntegrations>,
#[graphql(skip)]
pub sink_integrations: UserSinkIntegrations,
}

#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
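The user entity gains a non-optional `sink_integrations` JSON column (a later commit adds a default value for existing users). One plausible shape for the new type, purely as an assumption — the real `UserSinkIntegrations` lives in `users.rs` and may differ:

```rust
use serde::{Deserialize, Serialize};

// Assumed shape: each sink integration stores the slug that the webhook URL
// carries, so incoming Jellyfin payloads can be matched back to a user.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum UserSinkIntegrationSetting {
    Jellyfin { slug: String },
}

#[derive(Debug, Clone, Default, Serialize, Deserialize)]
pub struct UserSinkIntegrations(pub Vec<UserSinkIntegrationSetting>);
```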
5 changes: 2 additions & 3 deletions apps/backend/src/graphql.rs
@@ -5,7 +5,7 @@ use crate::{
fitness::exercise::resolver::{ExerciseMutation, ExerciseQuery},
importer::{ImporterMutation, ImporterQuery},
miscellaneous::resolver::{MiscellaneousMutation, MiscellaneousQuery},
utils::{AppServices, MemoryAuthDb},
utils::AppServices,
};

#[derive(Debug, SimpleObject, Serialize, Deserialize)]
@@ -21,13 +21,12 @@ pub struct MutationRoot(MiscellaneousMutation, ImporterMutation, ExerciseMutatio

pub type GraphqlSchema = Schema<QueryRoot, MutationRoot, EmptySubscription>;

pub async fn get_schema(app_services: &AppServices, auth_db: MemoryAuthDb) -> GraphqlSchema {
pub async fn get_schema(app_services: &AppServices) -> GraphqlSchema {
Schema::build(
QueryRoot::default(),
MutationRoot::default(),
EmptySubscription,
)
.data(auth_db)
.data(app_services.media_service.clone())
.data(app_services.importer_service.clone())
.data(app_services.exercise_service.clone())
5 changes: 2 additions & 3 deletions apps/backend/src/importer/goodreads.rs
@@ -1,7 +1,7 @@
use async_graphql::Result;
use chrono::{DateTime, Utc};
use itertools::Itertools;
use rust_decimal::{prelude::FromPrimitive, Decimal};
use rust_decimal::Decimal;
use rust_decimal_macros::dec;
use serde::{Deserialize, Serialize};

@@ -79,8 +79,7 @@ pub async fn import(input: DeployGoodreadsImportInput) -> Result<ImportResult> {
let rating: Decimal = d.user_rating.parse().unwrap();
if rating != dec!(0) {
// DEV: Rates items out of 5
single_review.rating =
Some(rating.saturating_mul(Decimal::from_u8(20).unwrap()))
single_review.rating = Some(rating.saturating_mul(dec!(20)))
}
};
if single_review.review.is_some() || single_review.rating.is_some() {
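This hunk (and the matching ones in media_tracker.rs, movary.rs, and story_graph.rs below) swaps runtime `Decimal::from_u8(...).unwrap()` conversions for the compile-time `dec!` macro. A small self-contained example of the resulting pattern:

```rust
use rust_decimal::Decimal;
use rust_decimal_macros::dec;

fn main() {
    // Goodreads rates items out of 5; the importer scales ratings to 100.
    let rating: Decimal = "4.5".parse().unwrap();
    let scaled = rating.saturating_mul(dec!(20));
    assert_eq!(scaled, dec!(90));
    println!("{rating}/5 -> {scaled}/100");
}
```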
7 changes: 3 additions & 4 deletions apps/backend/src/importer/media_tracker.rs
@@ -1,7 +1,8 @@
// Responsible for importing from https://github.com/bonukai/MediaTracker.

use async_graphql::Result;
use rust_decimal::{prelude::FromPrimitive, Decimal};
use rust_decimal::Decimal;
use rust_decimal_macros::dec;
use sea_orm::prelude::DateTimeUtc;
use serde::{Deserialize, Serialize};
use serde_with::{formats::Flexible, serde_as, TimestampMilliSeconds};
@@ -295,9 +296,7 @@ pub async fn import(input: DeployMediaTrackerImportInput) -> Result<ImportResult
ImportItemRating {
id: Some(r.id.to_string()),
review,
rating: r
.rating
.map(|d| d.saturating_mul(Decimal::from_u8(20).unwrap())),
rating: r.rating.map(|d| d.saturating_mul(dec!(20))),
}
})),
seen_history: details
25 changes: 14 additions & 11 deletions apps/backend/src/importer/mod.rs
@@ -20,7 +20,8 @@ use crate::{
AddMediaToCollection, CreateOrUpdateCollectionInput, MediaDetails, PostReviewInput,
ProgressUpdateInput,
},
utils::user_id_from_ctx,
traits::AuthProvider,
utils::MemoryDatabase,
};

mod goodreads;
@@ -170,11 +171,9 @@ impl ImporterQuery {
&self,
gql_ctx: &Context<'_>,
) -> Result<Vec<media_import_report::Model>> {
let user_id = user_id_from_ctx(gql_ctx).await?;
gql_ctx
.data_unchecked::<Arc<ImporterService>>()
.media_import_reports(user_id)
.await
let service = gql_ctx.data_unchecked::<Arc<ImporterService>>();
let user_id = service.user_id_from_ctx(gql_ctx).await?;
service.media_import_reports(user_id).await
}
}

@@ -189,11 +188,9 @@ impl ImporterMutation {
gql_ctx: &Context<'_>,
input: DeployImportJobInput,
) -> Result<String> {
let user_id = user_id_from_ctx(gql_ctx).await?;
gql_ctx
.data_unchecked::<Arc<ImporterService>>()
.deploy_import_job(user_id, input)
.await
let service = gql_ctx.data_unchecked::<Arc<ImporterService>>();
let user_id = service.user_id_from_ctx(gql_ctx).await?;
service.deploy_import_job(user_id, input).await
}
}

@@ -203,6 +200,12 @@ pub struct ImporterService {
import_media: SqliteStorage<ImportMedia>,
}

impl AuthProvider for ImporterService {
fn get_auth_db(&self) -> &MemoryDatabase {
self.media_service.get_auth_db()
}
}

impl ImporterService {
#[allow(clippy::too_many_arguments)]
pub fn new(
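The `AuthProvider` impl above only supplies `get_auth_db`; the `user_id_from_ctx` calls in the resolvers suggest a provided method on the trait. A rough sketch of that pattern, assuming the async-trait crate and assuming `MemoryDatabase` is a `retainer` cache mapping auth tokens to user ids — both the cache type and the method body are guesses, not the PR's code:

```rust
use std::sync::Arc;

use async_graphql::{Context, Error, Result};
use retainer::Cache;

// Assumed alias; the PR's MemoryDatabase may store a different value type.
type MemoryDatabase = Arc<Cache<String, i32>>;

#[async_trait::async_trait]
trait AuthProvider {
    fn get_auth_db(&self) -> &MemoryDatabase;

    // Provided method: every service that exposes the auth cache can resolve
    // the current user without a free-standing utils::user_id_from_ctx helper.
    async fn user_id_from_ctx(&self, ctx: &Context<'_>) -> Result<i32> {
        // Assumes the HTTP layer stashes the bearer token in the GraphQL context.
        let token = ctx.data::<String>()?;
        match self.get_auth_db().get(token).await {
            Some(user_id) => Ok(*user_id),
            None => Err(Error::new("no user found for the given auth token")),
        }
    }
}
```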
9 changes: 3 additions & 6 deletions apps/backend/src/importer/movary.rs
@@ -1,7 +1,8 @@
use async_graphql::Result;
use chrono::{DateTime, NaiveDate, NaiveDateTime, NaiveTime, Utc};
use csv::Reader;
use rust_decimal::{prelude::FromPrimitive, Decimal};
use rust_decimal::Decimal;
use rust_decimal_macros::dec;
use serde::{Deserialize, Serialize};

use crate::{
@@ -64,11 +65,7 @@ pub async fn import(input: DeployMovaryImportInput) -> Result<ImportResult> {
reviews: vec![ImportItemRating {
id: None,
// DEV: Rates items out of 10
rating: Some(
record
.user_rating
.saturating_mul(Decimal::from_u16(10).unwrap()),
),
rating: Some(record.user_rating.saturating_mul(dec!(10))),
review: None,
}],
collections: vec![],
5 changes: 3 additions & 2 deletions apps/backend/src/importer/story_graph.rs
@@ -3,7 +3,8 @@ use chrono::{DateTime, NaiveDate, NaiveDateTime, NaiveTime, Utc};
use convert_case::{Case, Casing};
use csv::Reader;
use itertools::Itertools;
use rust_decimal::{prelude::FromPrimitive, Decimal};
use rust_decimal::Decimal;
use rust_decimal_macros::dec;
use serde::{Deserialize, Serialize};

use crate::{
@@ -110,7 +111,7 @@ pub async fn import(
rating: record
.rating
// DEV: Rates items out of 10
.map(|d| d.saturating_mul(Decimal::from_u8(10).unwrap())),
.map(|d| d.saturating_mul(dec!(10))),
review: record.review.map(|r| ImportItemReview {
date: None,
spoiler: false,