Gaming platform for RADA project
The main objective of the project is to integrate the zoos of the CB region into a joint tourist attraction by developing and implementing a cross-border service package for creative adventure learning. The package consists of a set of creative online multimedia game-based learning tools and related apps for mobile devices.
This tool set includes:
- an online tool for creating at least four types of location-based interactive assignments relevant for zoo settings (triggered either by a QR code or GPS-based proximity): augmented reality assignments; quizzes; crossword and memory puzzles; video and photo storytelling assignments;
- an online repository of interactive assignments;
- an online environment (Web platform + mobile clients) for composing and conducting location-based GPS adventure games, challenges and tournaments that utilise all these different types of interactive assignments. The platform uses Mozilla Open Badges framework for awarding the achievements of players.
Carried out within the action plan of the Mobilitas Pluss project MOBEC001 "Cross-border educational innovation supported by technology-based research".
- Requirements are best determined using the Server Requirements page of the corresponding Laravel version
- Currently used version is 8.x
- PHP version 8.1 (some of the additional modules have strict requirements)
- SSH (terminal) access to the server
- Composer installed, either globally or locally in place
- Clone the repository
- Make sure to switch to the `production` branch if you are running on a production server. In case of a development instance, make sure to build the static assets (refer to working with compiled assets for more information)
- Move into the directory (`cd rada2`)
- Install the dependencies with `composer install` (add `--no-dev` in production)
- Make sure the `.env` file is present and configured as needed (copy `.env.example` to `.env` and fill it in with data)
- Either create the application key manually or do that with the command `php artisan key:generate`
- Make sure that `APP_ENV=production` and `APP_DEBUG=false` are set for production environments (this should prevent exposure of detailed error data)
- Create and configure Google and Facebook apps, and fill in the needed environment variables
  - Please make sure to activate the `Google Maps JavaScript API`, `Google Maps Embed API` and `Google+ API`, and create the API key for Maps and the OAuth credentials
  - Please make sure that the Facebook app is set to `public` mode
  - The redirect URL addresses are in the form `BASE-URL/auth/PROVIDER-name/callback`
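For example, with the two providers used here (Google and Facebook) and an assumed base URL of `https://rada.example.org` (assuming the provider names in the URL are the lowercase `google` and `facebook`), the redirect URLs would look like:

```
https://rada.example.org/auth/google/callback
https://rada.example.org/auth/facebook/callback
```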
- Create reCAPTCHA (v2) keys and fill in the configuration in the `.env` file. The constants to look for are `NOCAPTCHA_SECRET` and `NOCAPTCHA_SITEKEY`
- Create and configure the Google Analytics and UserReport services. The configuration constant names are `GOOGLE_ANALYTICS_KEY` and `USERREPORT_KEY`
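Taken together, the environment constants named in this guide form a `.env` fragment along these lines (all values below are placeholders; the Google and Facebook OAuth variable names are not listed in this document and are therefore omitted):

```ini
# Production hardening
APP_ENV=production
APP_DEBUG=false

# reCAPTCHA (v2) keys
NOCAPTCHA_SECRET=your-secret-key
NOCAPTCHA_SITEKEY=your-site-key

# Analytics / feedback services
GOOGLE_ANALYTICS_KEY=your-analytics-key
USERREPORT_KEY=your-userreport-key

# xAPI queue processing (see the xAPI integration steps)
QUEUE_DRIVER=database
```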
- Create the database as needed (according to the configuration provided)
- For xAPI integration (step 1):
  - Make sure you have set `QUEUE_DRIVER=database` in the `.env` file
  - Run `php artisan queue:table` from the terminal
- Run `php artisan migrate` from the terminal (this should create the database structure and more)
- Running `php artisan serve` would serve the app in development (or configure the server of your choice)
- Run `php artisan db:seed` to ensure that the database is filled with the required information
- For xAPI integration (step 2):
  - Run `php artisan queue:work --tries=0 &` from the terminal to start the queue worker
- Make sure to deploy Passport as described in the documentation
  - Only the keys would be needed by default
- Make sure to define API access permissions in the `.env` file by specifying user identifiers
- Set up scheduled jobs (Cron)
  - These jobs should be run from the command-line interface (CLI), as they might require a longer time to run
  - Please check the documentation for detailed instructions
- Create an API service account which can access the desired Google Analytics project (see the documentation)
- Add the service account JSON file to the project
- Add `GOOGLE_SERVICE_ACCOUNT_KEY_JSON_NAME` to the `.env` file, pointing to the JSON file added in the previous step
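For the scheduled jobs above, the standard Laravel pattern is a single crontab entry that invokes the scheduler every minute, which then dispatches the long-running jobs. This document does not state whether the project uses Laravel's scheduler or individual artisan commands in cron, so treat the entry below as a starting point (the project path is a placeholder):

```
# Invoke Laravel's scheduler every minute; it runs the jobs that are due.
# /path/to/rada2 stands in for the actual service directory.
* * * * * cd /path/to/rada2 && php artisan schedule:run >> /dev/null 2>&1
```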
The upgrade procedure is generally simple enough, but could also involve some additional steps. This could be the case if some aspects of the application logic have changed. One example is a structural change to the storage logic; in that particular case the migration has to be applied manually by running `php artisan storage:public:migrate`. Another case would be additions to the configuration. Most of those are well described in the commit messages.
The common steps for the process are as follows:
- Log into the server terminal with the `ssh` command
- Navigate to the service directory (not to be mistaken with the `public` subdirectory)
- Run `php artisan down` to temporarily suspend the service
- Consider creating a backup of the database and the current service files
- Run `git pull` to get the latest code from the `production` branch (or use any other means to get the latest code and replace the current one)
- Run `php artisan migrate:status` to determine if migrations are required
  - If there are any not yet applied migrations, run the `php artisan migrate` command
  - You can check the state of migrations at any time by running `php artisan migrate:status` again
- Run `php artisan view:clear` to clear any view caches
- Check if any additional migrations/configurations are required
  - If any modules have been added to the requirements, those could be installed by running `composer require MODULE-NAME` or just `composer update`
  - See if any configurations have been added and need to be set
  - See if there is a need to run any additional migration commands
- Once you are sure that all required steps have been taken, run `php artisan up` to make the service available again
Working with JS and SASS requires Node.js and Webpack for compilation, generation and management purposes. More information can be found on the Laravel Mix documentation pages.
Tasks have been tested to run well with Node.js versions 12 and 14; other versions might have compatibility issues.
If all the dependencies are installed (`npm install`), then one would only need to run one of the tasks:

- `npm run dev` would build assets for development purposes
- `npm run prod` would build assets for use in production
- `npm run watch` would build assets for development and keep watching for changes (CTRL + C to cancel the task in the terminal)
- Look at the `package.json` file for more details
Once the code has reached a stable point and is ready to be run in production, the `main` branch needs to be merged into the one called `production`. The process requires a few steps to be taken:
- Switch to the local branch that is tracking the upstream `production` branch
  - This should provide enough instructions to set up the branch from the terminal
- Merge the code from `main` by running `git merge --strategy-option=theirs main`
  - The strategy option is needed to make sure that no conflicts arise. The choice of resolution direction does not matter too much, as any static assets will be rebuilt; it is solely needed because the generated/built static asset files (JS and CSS) would otherwise conflict, since both branches build their own versions of those files. The ones in `production` are minified and made ready for production use
- Make sure to build the production assets by running `npm run prod`, which should add new files to the `public` directory and its subdirectories. Versioning information is kept in `mix-manifest.json`
  - If no changes were made to the base JS or CSS files, there may be no changes to the static assets. Still, running the build just in case is a good idea (unless one is absolutely sure that it is not necessary)
- Add all the changed files, commit and push
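The merge step above can be tried out safely in a throwaway repository. This is only a sketch of what `--strategy-option=theirs` does; the branch layout mirrors this project, but the file names and contents are invented for the demonstration:

```shell
# Demonstrate git merge --strategy-option=theirs in a scratch repository:
# both branches change the same built asset file, and the option resolves
# the conflict in favour of the branch being merged in (main).
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email demo@example.com
git config user.name demo
git checkout -qb main
echo "shared-base" > app.js
git add app.js
git commit -qm "initial commit"
git checkout -qb production
echo "minified-for-prod" > app.js
git commit -qam "production build"
git checkout -q main
echo "new-feature" > app.js
git commit -qam "feature work"
git checkout -q production
# Without -Xtheirs this merge would stop on a conflict in app.js
git merge -q --strategy-option=theirs main -m "merge main into production"
cat app.js    # prints: new-feature
```

After the real merge, `npm run prod` regenerates the built assets anyway, which is why the direction of the conflict resolution is not critical.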
All translations are placed under `ROOT/resources/lang/LANGUAGE`, where `LANGUAGE` is one of the following: `en`, `et` and `ru`.
In order to create a new translation, please copy all the files from `ROOT/resources/lang/en` to the newly created directory. NB! Please note that the system will need to be configured to recognise the newly added translation.
Please note that the files for different translations should be kept synchronised, meaning that new translations should be added to all of these files and translated right away. The translation system should be able to fall back to the default-language strings, provided those are present.
NB! While translating textual strings, please make sure not to remove the replacement parts. Each replacement part can be identified by having a colon in front of the word. Example: in `User :name has logged in.`, the `:name` is a placeholder that will be substituted with some value and should be left as is in all translations.
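As a quick illustration of why the placeholder must survive translation, here is roughly what the substitution does. This is a simplified stand-in for Laravel's translator, not the actual implementation:

```shell
# The translator swaps each ":key" token for a runtime value.
# If a translation drops ":name", the user's name can no longer be inserted.
line='User :name has logged in.'
echo "$line" | sed 's/:name/Alice/'
# prints: User Alice has logged in.
```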
Also check any additional translations within the `vendor` subdirectory. It hosts any translation files that belong to third-party modules. The logic of creating translations for new languages is the same as described earlier.
If you need to be running in production mode, then please use the branch named `production` instead of `main`. That branch might not have all the latest changes, but it should have the JS and CSS assets built for running in production (minified and more).
You can easily use Sail or Docker Compose in development. A corresponding `docker-compose.yml` is included.
A few commands allow one to import external image data into the system, build a local geocoding index, and geocode the imported data based on that local positioning table.

- `import:muinas:data` - fetches data from register.muinas.ee, processes the imported entries, extracts image data and creates local ExternalImageResource entries. Imported entries will still be missing positioning data. The command will either truncate the table or delete all entries for this provider.
- `import:ajapaik:data` - fetches data from ajapaik.ee, processes the imported entries, extracts image data and creates local ExternalImageResource entries. Some entries will get positioning data, but not all. The command will either truncate the table or delete all entries for this provider.
- `build:external:image:data:geocode:index` - builds a unique address index based on the image data, geocodes it using an external service and stores local positions on success. Addresses are converted to unique hashes that are later used for lookup. NB! Please note that each geocoding service request will incur additional cost.
- `geocode:external:image:data` - uses the local positioning data to set positions on imported external resources that are still missing them.

Please note that the commands are best run in this order: import data, build the positioning index, geocode the imported data. The whole process could take a few hours to complete!
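The "addresses are converted to unique hashes" step can be pictured as deriving a stable lookup key per address, so repeated occurrences of the same address reuse one geocoding result. The actual hash function used by the project is not documented here; md5 below is purely an assumed stand-in for illustration:

```shell
# Illustration only: derive a stable lookup key from an address string.
# md5 is an assumption; the project's real hashing scheme is not specified.
addr="Raekoja plats 1, Tallinn"
key=$(printf '%s' "$addr" | md5sum | cut -d' ' -f1)
echo "$key"
```

The same address always yields the same key, which is what makes the index usable for lookups before paying for another geocoding request.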
Location-based image search requires `ST_Distance_Sphere` to be defined. If one is not present, then a custom function `Custom_ST_Distance_Sphere` will be defined instead. This requires permission to create database procedures and functions.
MIT License
Copyright (c) 2021 RADA Project
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.