diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 999863dd9..bb2441701 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -528,7 +528,7 @@ jobs: report_name: Coverage Backend type: lcov result_path: ./backend/coverage/lcov.info - min_coverage: 65 + min_coverage: 66 token: ${{ github.token }} ########################################################################## diff --git a/.gitignore b/.gitignore index 302fe5578..a2f7ef2b3 100644 --- a/.gitignore +++ b/.gitignore @@ -1,9 +1,14 @@ <<<<<<< HEAD +<<<<<<< HEAD .dbeaver/* ======= .dbeaver .project >>>>>>> refs/remotes/origin/improve-apollo-logging +======= +.dbeaver +.project +>>>>>>> refs/remotes/origin/master *.log /node_modules/* messages.pot diff --git a/README.md b/README.md index 2911c13d3..289a39109 100644 --- a/README.md +++ b/README.md @@ -9,21 +9,26 @@ The Corona crisis has fundamentally changed our world within a very short time. The dominant financial system threatens to fail around the globe, followed by mass insolvencies, record unemployment and abject poverty. Only with a sustainable new monetary system can humanity master these challenges of the 21st century. The Gradido Academy for Bionic Economy has developed such a system. Find out more about the Project on its [Website](https://gradido.net/). It is offering vast resources about the idea. The remaining document will discuss the gradido software only. + ## Software requirements Currently we only support `docker` install instructions to run all services, since many different programming languages and frameworks are used. 
-- [docker](https://www.docker.com/) +- [docker](https://www.docker.com/) - [docker-compose] +- [yarn](https://phoenixnap.com/kb/yarn-windows) ### For Arch Linux + Install the required packages: + ```bash sudo pacman -S docker sudo pacman -S docker-compose ``` -Add group `docker` and then your user to it in order to allow you to run docker without sudo +Add group `docker` and then your user to it in order to allow you to run docker without sudo + ```bash sudo groupadd docker # may already exist `groupadd: group 'docker' already exists` sudo usermod -aG docker $USER @@ -31,26 +36,58 @@ groups # verify you have the group (requires relog) ``` Start the docker service: + ```bash sudo systemctl start docker ``` +### For Windows + +#### docker + +The installation of Docker depends on the product package you select from the [Docker page](https://www.docker.com/). For Windows, *Docker Desktop* is the product of choice. Please follow the installation instructions for your selected product. + +##### known problems + +* In case Docker Desktop does not start correctly because of a previous Docker installation, please clean the directories used by the previous installation - `C:\Users` - before you retry starting Docker Desktop. For further problems running Docker Desktop, please take a look at "[logs and trouble shooting](https://docs.docker.com/desktop/windows/troubleshoot/)" +* In case your Docker Desktop installation causes high memory consumption by the vmmem process, please take a look at "[vmmen process consuming too much memory (Docker Desktop)](https://dev.to/tallesl/vmmen-process-consuming-too-much-memory-docker-desktop-273p)" + +#### yarn + +The Gradido build process uses the yarn package manager. Please download and install [yarn for Windows](https://phoenixnap.com/kb/yarn-windows) by following the instructions there. + ## How to run? 
+As soon as the software requirements are fulfilled and a Docker installation is up and running, open a PowerShell on Windows or another command-line prompt on Linux. + +Create and navigate to the directory where you want to create the Gradido runtime environment. + +``` +mkdir \Gradido +cd \Gradido +``` + ### 1. Clone Sources + Clone the repo and pull all submodules + ```bash git clone git@github.com:gradido/gradido.git git submodule update --recursive --init ``` ### 2. Run docker-compose -Run docker-compose to bring up the development environment +Run docker-compose to bring up the development environment + ```bash docker-compose up ``` + ### Additional Build options + If you want to build for production you can do this as well: + ```bash docker-compose -f docker-compose.yml up ``` @@ -73,6 +110,7 @@ A release is tagged on Github by its version number and published as github release. Each release is accompanied with release notes automatically generated from the git log which is available as [CHANGELOG.md](./CHANGELOG.md). To generate the Changelog and set a new Version you should use the following commands in the main folder + ```bash git fetch --all yarn release @@ -85,10 +123,10 @@ Note: The Changelog will be regenerated with all tags on release on the external ## Troubleshooting -| Problem | Issue | Solution | Description | -| ------- | ----- | -------- | ----------- | +| Problem | Issue | Solution | Description | +| ------------------------------------------------ | ---------------------------------------------------- | ----------------------------------------------------------------------------- | --------------------------------------------------------------------------- | | docker-compose raises database connection errors | [#1062](https://github.com/gradido/gradido/issues/1062) | End `ctrl+c` and restart the `docker-compose up` after a successful build | Several Database connection related errors occur in the docker-compose log. 
| -| Wallet page is empty | [#1063](https://github.com/gradido/gradido/issues/1063) | Accept Cookies and Local Storage in your Browser | The page stays empty when navigating to [http://localhost/](http://localhost/) | +| Wallet page is empty | [#1063](https://github.com/gradido/gradido/issues/1063) | Accept Cookies and Local Storage in your Browser | The page stays empty when navigating to [http://localhost/](http://localhost/) | ## Useful Links diff --git a/admin/.prettierrc.js b/admin/.prettierrc.js index e88113754..bc1d767d7 100644 --- a/admin/.prettierrc.js +++ b/admin/.prettierrc.js @@ -4,5 +4,6 @@ module.exports = { singleQuote: true, trailingComma: "all", tabWidth: 2, - bracketSpacing: true + bracketSpacing: true, + endOfLine: "auto", }; diff --git a/admin/package.json b/admin/package.json index 57711b8be..c649ca752 100644 --- a/admin/package.json +++ b/admin/package.json @@ -4,7 +4,7 @@ "main": "index.js", "author": "Moriz Wahl", "version": "1.8.3", - "license": "MIT", + "license": "Apache-2.0", "private": false, "scripts": { "start": "node run/server.js", diff --git a/admin/src/graphql/searchUsers.js b/admin/src/graphql/searchUsers.js index ddf759031..5740e24cc 100644 --- a/admin/src/graphql/searchUsers.js +++ b/admin/src/graphql/searchUsers.js @@ -5,15 +5,13 @@ export const searchUsers = gql` $searchText: String! 
$currentPage: Int $pageSize: Int - $filterByActivated: Boolean - $filterByDeleted: Boolean + $filters: SearchUsersFiltersInput ) { searchUsers( searchText: $searchText currentPage: $currentPage pageSize: $pageSize - filterByActivated: $filterByActivated - filterByDeleted: $filterByDeleted + filters: $filters ) { userCount userList { diff --git a/admin/src/pages/Creation.spec.js b/admin/src/pages/Creation.spec.js index 98c03d277..432cbe19b 100644 --- a/admin/src/pages/Creation.spec.js +++ b/admin/src/pages/Creation.spec.js @@ -71,8 +71,10 @@ describe('Creation', () => { searchText: '', currentPage: 1, pageSize: 25, - filterByActivated: true, - filterByDeleted: false, + filters: { + filterByActivated: true, + filterByDeleted: false, + }, }, }), ) @@ -271,8 +273,10 @@ describe('Creation', () => { searchText: 'XX', currentPage: 1, pageSize: 25, - filterByActivated: true, - filterByDeleted: false, + filters: { + filterByActivated: true, + filterByDeleted: false, + }, }, }), ) @@ -288,8 +292,10 @@ describe('Creation', () => { searchText: '', currentPage: 1, pageSize: 25, - filterByActivated: true, - filterByDeleted: false, + filters: { + filterByActivated: true, + filterByDeleted: false, + }, }, }), ) @@ -305,8 +311,10 @@ describe('Creation', () => { searchText: '', currentPage: 2, pageSize: 25, - filterByActivated: true, - filterByDeleted: false, + filters: { + filterByActivated: true, + filterByDeleted: false, + }, }, }), ) diff --git a/admin/src/pages/Creation.vue b/admin/src/pages/Creation.vue index 54bc0d735..17962bfff 100644 --- a/admin/src/pages/Creation.vue +++ b/admin/src/pages/Creation.vue @@ -102,8 +102,10 @@ export default { searchText: this.criteria, currentPage: this.currentPage, pageSize: this.perPage, - filterByActivated: true, - filterByDeleted: false, + filters: { + filterByActivated: true, + filterByDeleted: false, + }, }, fetchPolicy: 'network-only', }) diff --git a/admin/src/pages/UserSearch.spec.js b/admin/src/pages/UserSearch.spec.js index 
2eb24f84b..a1d809a66 100644 --- a/admin/src/pages/UserSearch.spec.js +++ b/admin/src/pages/UserSearch.spec.js @@ -7,7 +7,7 @@ const localVue = global.localVue const apolloQueryMock = jest.fn().mockResolvedValue({ data: { searchUsers: { - userCount: 1, + userCount: 4, userList: [ { userId: 1, @@ -82,8 +82,10 @@ describe('UserSearch', () => { searchText: '', currentPage: 1, pageSize: 25, - filterByActivated: null, - filterByDeleted: null, + filters: { + filterByActivated: null, + filterByDeleted: null, + }, }, }), ) @@ -101,8 +103,10 @@ describe('UserSearch', () => { searchText: '', currentPage: 1, pageSize: 25, - filterByActivated: false, - filterByDeleted: null, + filters: { + filterByActivated: false, + filterByDeleted: null, + }, }, }), ) @@ -121,8 +125,10 @@ describe('UserSearch', () => { searchText: '', currentPage: 1, pageSize: 25, - filterByActivated: null, - filterByDeleted: true, + filters: { + filterByActivated: null, + filterByDeleted: true, + }, }, }), ) @@ -141,8 +147,10 @@ describe('UserSearch', () => { searchText: '', currentPage: 2, pageSize: 25, - filterByActivated: null, - filterByDeleted: null, + filters: { + filterByActivated: null, + filterByDeleted: null, + }, }, }), ) @@ -161,8 +169,10 @@ describe('UserSearch', () => { searchText: 'search string', currentPage: 1, pageSize: 25, - filterByActivated: null, - filterByDeleted: null, + filters: { + filterByActivated: null, + filterByDeleted: null, + }, }, }), ) @@ -178,8 +188,10 @@ describe('UserSearch', () => { searchText: '', currentPage: 1, pageSize: 25, - filterByActivated: null, - filterByDeleted: null, + filters: { + filterByActivated: null, + filterByDeleted: null, + }, }, }), ) diff --git a/admin/src/pages/UserSearch.vue b/admin/src/pages/UserSearch.vue index f8ceac36c..7b638c316 100644 --- a/admin/src/pages/UserSearch.vue +++ b/admin/src/pages/UserSearch.vue @@ -97,8 +97,10 @@ export default { searchText: this.criteria, currentPage: this.currentPage, pageSize: this.perPage, - 
filterByActivated: this.filterByActivated, - filterByDeleted: this.filterByDeleted, + filters: { + filterByActivated: this.filterByActivated, + filterByDeleted: this.filterByDeleted, + }, }, fetchPolicy: 'no-cache', }) diff --git a/backend/.env.dist b/backend/.env.dist index de33a7272..62b786456 100644 --- a/backend/.env.dist +++ b/backend/.env.dist @@ -49,4 +49,8 @@ EMAIL_CODE_VALID_TIME=1440 EMAIL_CODE_REQUEST_TIME=10 # Webhook -WEBHOOK_ELOPAGE_SECRET=secret \ No newline at end of file +WEBHOOK_ELOPAGE_SECRET=secret + +# SET LOG LEVEL AS NEEDED IN YOUR .ENV +# POSSIBLE VALUES: all | trace | debug | info | warn | error | fatal +# LOG_LEVEL=info diff --git a/backend/.env.template b/backend/.env.template index 8ce8fca4e..140ec67e9 100644 --- a/backend/.env.template +++ b/backend/.env.template @@ -47,4 +47,4 @@ EMAIL_CODE_VALID_TIME=$EMAIL_CODE_VALID_TIME EMAIL_CODE_REQUEST_TIME=$EMAIL_CODE_REQUEST_TIME # Webhook -WEBHOOK_ELOPAGE_SECRET=$WEBHOOK_ELOPAGE_SECRET \ No newline at end of file +WEBHOOK_ELOPAGE_SECRET=$WEBHOOK_ELOPAGE_SECRET diff --git a/backend/.prettierrc.js b/backend/.prettierrc.js index 8495e3f20..bc1d767d7 100644 --- a/backend/.prettierrc.js +++ b/backend/.prettierrc.js @@ -5,4 +5,5 @@ module.exports = { trailingComma: "all", tabWidth: 2, bracketSpacing: true, + endOfLine: "auto", }; diff --git a/backend/log4js-config.json b/backend/log4js-config.json new file mode 100644 index 000000000..451da56ab --- /dev/null +++ b/backend/log4js-config.json @@ -0,0 +1,102 @@ +{ + "appenders": + { + "access": + { + "type": "dateFile", + "filename": "../logs/backend/access.log", + "pattern": "%d{ISO8601} %p %c %X{user} %f:%l %m", + "keepFileExt" : true, + "fileNameSep" : "_" + }, + "apollo": + { + "type": "dateFile", + "filename": "../logs/backend/apollo.log", + "pattern": "%d{ISO8601} %p %c %m", + "keepFileExt" : true, + "fileNameSep" : "_" + }, + "backend": + { + "type": "dateFile", + "filename": "../logs/backend/backend.log", + "pattern": "%d{ISO8601} %p %c 
%X{user} %f:%l %m", + "keepFileExt" : true, + "fileNameSep" : "_" + }, + "errorFile": + { + "type": "dateFile", + "filename": "../logs/backend/errors.log", + "pattern": "%d{ISO8601} %p %c %X{user} %f:%l %m", + "keepFileExt" : true, + "fileNameSep" : "_" + }, + "errors": + { + "type": "logLevelFilter", + "level": "error", + "appender": "errorFile" + }, + "out": + { + "type": "stdout", + "layout": + { + "type": "pattern", "pattern": "%d{ISO8601} %p %c %X{user} %f:%l %m" + } + }, + "apolloOut": + { + "type": "stdout", + "layout": + { + "type": "pattern", "pattern": "%d{ISO8601} %p %c %m" + } + } + }, + "categories": + { + "default": + { + "appenders": + [ + "out", + "errors" + ], + "level": "debug", + "enableCallStack": true + }, + "apollo": + { + "appenders": + [ + "apollo", + "apolloOut", + "errors" + ], + "level": "debug", + "enableCallStack": true + }, + "backend": + { + "appenders": + [ + "backend", + "out", + "errors" + ], + "level": "debug", + "enableCallStack": true + }, + "http": + { + "appenders": + [ + "access" + ], + "level": "info" + } + } +} diff --git a/backend/package.json b/backend/package.json index a9febe918..ff483a0c6 100644 --- a/backend/package.json +++ b/backend/package.json @@ -5,7 +5,7 @@ "main": "src/index.ts", "repository": "https://github.com/gradido/gradido/backend", "author": "Ulf Gebhardt", - "license": "MIT", + "license": "Apache-2.0", "private": false, "scripts": { "build": "tsc --build", @@ -19,7 +19,6 @@ "dependencies": { "@types/jest": "^27.0.2", "@types/lodash.clonedeep": "^4.5.6", - "apollo-log": "^1.1.0", "apollo-server-express": "^2.25.2", "apollo-server-testing": "^2.25.2", "axios": "^0.21.1", @@ -33,6 +32,7 @@ "jest": "^27.2.4", "jsonwebtoken": "^8.5.1", "lodash.clonedeep": "^4.5.0", + "log4js": "^6.4.6", "mysql2": "^2.3.0", "nodemailer": "^6.6.5", "random-bigint": "^0.0.1", diff --git a/backend/src/apis/HttpRequest.ts b/backend/src/apis/HttpRequest.ts index c1f99dc46..4039e3a98 100644 --- a/backend/src/apis/HttpRequest.ts +++ 
b/backend/src/apis/HttpRequest.ts @@ -1,10 +1,14 @@ import axios from 'axios' +import { backendLogger as logger } from '@/server/logger' + // eslint-disable-next-line @typescript-eslint/no-explicit-any export const apiPost = async (url: string, payload: unknown): Promise<any> => { + logger.trace('POST: url=' + url + ' payload=' + payload) return axios .post(url, payload) .then((result) => { + logger.trace('POST-Response: result=' + result) if (result.status !== 200) { throw new Error('HTTP Status Error ' + result.status) } @@ -20,9 +24,11 @@ export const apiPost = async (url: string, payload: unknown): Promise<any> => { // eslint-disable-next-line @typescript-eslint/no-explicit-any export const apiGet = async (url: string): Promise<any> => { + logger.trace('GET: url=' + url) return axios .get(url) .then((result) => { + logger.trace('GET-Response: result=' + result) if (result.status !== 200) { throw new Error('HTTP Status Error ' + result.status) } diff --git a/backend/src/config/index.test.ts b/backend/src/config/index.test.ts index 3c4c7865e..1dabf9292 100644 --- a/backend/src/config/index.test.ts +++ b/backend/src/config/index.test.ts @@ -3,7 +3,7 @@ import CONFIG from './index' describe('config/index', () => { describe('decay start block', () => { it('has the correct date set', () => { - expect(CONFIG.DECAY_START_TIME).toEqual(new Date('2021-05-13 17:46:31')) + expect(CONFIG.DECAY_START_TIME).toEqual(new Date('2021-05-13 17:46:31-0000')) }) }) }) diff --git a/backend/src/config/index.ts b/backend/src/config/index.ts index eed4278d2..28318ed6b 100644 --- a/backend/src/config/index.ts +++ b/backend/src/config/index.ts @@ -11,7 +11,10 @@ Decimal.set({ const constants = { DB_VERSION: '0036-unique_previous_in_transactions', - DECAY_START_TIME: new Date('2021-05-13 17:46:31'), // GMT+0 + DECAY_START_TIME: new Date('2021-05-13 17:46:31-0000'), // GMT+0 + LOG4JS_CONFIG: 'log4js-config.json', + // default log level on production should be info + LOG_LEVEL: process.env.LOG_LEVEL ||
'info', CONFIG_VERSION: { DEFAULT: 'DEFAULT', EXPECTED: 'v6.2022-04-21', diff --git a/backend/src/graphql/arg/SearchUsersArgs.ts b/backend/src/graphql/arg/SearchUsersArgs.ts index b47f39d56..8db6bfc06 100644 --- a/backend/src/graphql/arg/SearchUsersArgs.ts +++ b/backend/src/graphql/arg/SearchUsersArgs.ts @@ -1,4 +1,5 @@ import { ArgsType, Field, Int } from 'type-graphql' +import SearchUsersFilters from '@arg/SearchUsersFilters' @ArgsType() export default class SearchUsersArgs { @@ -11,9 +12,6 @@ export default class SearchUsersArgs { @Field(() => Int, { nullable: true }) pageSize?: number - @Field(() => Boolean, { nullable: true }) - filterByActivated?: boolean | null - - @Field(() => Boolean, { nullable: true }) - filterByDeleted?: boolean | null + @Field(() => SearchUsersFilters, { nullable: true }) + filters: SearchUsersFilters } diff --git a/backend/src/graphql/arg/SearchUsersFilters.ts b/backend/src/graphql/arg/SearchUsersFilters.ts new file mode 100644 index 000000000..de7c7c20a --- /dev/null +++ b/backend/src/graphql/arg/SearchUsersFilters.ts @@ -0,0 +1,11 @@ +import { Field, InputType, ObjectType } from 'type-graphql' + +@ObjectType() +@InputType('SearchUsersFiltersInput') +export default class SearchUsersFilters { + @Field(() => Boolean, { nullable: true, defaultValue: null }) + filterByActivated?: boolean | null + + @Field(() => Boolean, { nullable: true, defaultValue: null }) + filterByDeleted?: boolean | null +} diff --git a/backend/src/graphql/arg/TransactionLinkFilters.ts b/backend/src/graphql/arg/TransactionLinkFilters.ts index e2f752d3f..b009a3180 100644 --- a/backend/src/graphql/arg/TransactionLinkFilters.ts +++ b/backend/src/graphql/arg/TransactionLinkFilters.ts @@ -3,11 +3,11 @@ import { ArgsType, Field } from 'type-graphql' @ArgsType() export default class TransactionLinkFilters { @Field(() => Boolean, { nullable: true, defaultValue: true }) - withDeleted?: boolean + filterByDeleted?: boolean @Field(() => Boolean, { nullable: true, defaultValue: 
true }) - withExpired?: boolean + filterByExpired?: boolean @Field(() => Boolean, { nullable: true, defaultValue: true }) - withRedeemed?: boolean + filterByRedeemed?: boolean } diff --git a/backend/src/graphql/arg/UpdateUserInfosArgs.ts b/backend/src/graphql/arg/UpdateUserInfosArgs.ts index d1e95ebef..81c07a329 100644 --- a/backend/src/graphql/arg/UpdateUserInfosArgs.ts +++ b/backend/src/graphql/arg/UpdateUserInfosArgs.ts @@ -19,7 +19,4 @@ export default class UpdateUserInfosArgs { @Field({ nullable: true }) passwordNew?: string - - @Field({ nullable: true }) - coinanimation?: boolean } diff --git a/backend/src/graphql/enum/Setting.ts b/backend/src/graphql/enum/Setting.ts deleted file mode 100644 index 8efeec72d..000000000 --- a/backend/src/graphql/enum/Setting.ts +++ /dev/null @@ -1,5 +0,0 @@ -enum Setting { - COIN_ANIMATION = 'coinanimation', -} - -export { Setting } diff --git a/backend/src/graphql/model/User.ts b/backend/src/graphql/model/User.ts index 4f577f60a..86c56312f 100644 --- a/backend/src/graphql/model/User.ts +++ b/backend/src/graphql/model/User.ts @@ -15,8 +15,6 @@ export class User { this.language = user.language this.publisherId = user.publisherId this.isAdmin = user.isAdmin - // TODO - this.coinanimation = null this.klickTipp = null this.hasElopage = null } @@ -61,11 +59,6 @@ export class User { @Field(() => Date, { nullable: true }) isAdmin: Date | null - // TODO this is a bit inconsistent with what we query from the database - // therefore all those fields are now nullable with default value null - @Field(() => Boolean, { nullable: true }) - coinanimation: boolean | null - @Field(() => KlickTipp, { nullable: true }) klickTipp: KlickTipp | null diff --git a/backend/src/graphql/resolver/AdminResolver.test.ts b/backend/src/graphql/resolver/AdminResolver.test.ts index ca6bf0fe7..4771232ea 100644 --- a/backend/src/graphql/resolver/AdminResolver.test.ts +++ b/backend/src/graphql/resolver/AdminResolver.test.ts @@ -1,6 +1,7 @@ /* eslint-disable 
@typescript-eslint/no-explicit-any */ /* eslint-disable @typescript-eslint/explicit-module-boundary-types */ +import { convertObjValuesToArray } from '@/util/utilities' import { testEnvironment, resetToken, cleanDB } from '@test/helpers' import { userFactory } from '@/seeds/factory/user' import { creationFactory } from '@/seeds/factory/creation' @@ -11,6 +12,7 @@ import { garrickOllivander } from '@/seeds/users/garrick-ollivander' import { deleteUser, unDeleteUser, + searchUsers, createPendingCreation, createPendingCreations, updatePendingCreation, @@ -261,6 +263,224 @@ describe('AdminResolver', () => { }) }) + describe('search users', () => { + const variablesWithoutTextAndFilters = { + searchText: '', + currentPage: 1, + pageSize: 25, + filters: null, + } + + describe('unauthenticated', () => { + it('returns an error', async () => { + await expect( + query({ + query: searchUsers, + variables: { + ...variablesWithoutTextAndFilters, + }, + }), + ).resolves.toEqual( + expect.objectContaining({ + errors: [new GraphQLError('401 Unauthorized')], + }), + ) + }) + }) + + describe('authenticated', () => { + describe('without admin rights', () => { + beforeAll(async () => { + user = await userFactory(testEnv, bibiBloxberg) + await query({ + query: login, + variables: { email: 'bibi@bloxberg.de', password: 'Aa12345_' }, + }) + }) + + afterAll(async () => { + await cleanDB() + resetToken() + }) + + it('returns an error', async () => { + await expect( + query({ + query: searchUsers, + variables: { + ...variablesWithoutTextAndFilters, + }, + }), + ).resolves.toEqual( + expect.objectContaining({ + errors: [new GraphQLError('401 Unauthorized')], + }), + ) + }) + }) + + describe('with admin rights', () => { + const allUsers = { + bibi: expect.objectContaining({ + email: 'bibi@bloxberg.de', + }), + garrick: expect.objectContaining({ + email: 'garrick@ollivander.com', + }), + peter: expect.objectContaining({ + email: 'peter@lustig.de', + }), + stephen: expect.objectContaining({ + 
email: 'stephen@hawking.uk', + }), + } + + beforeAll(async () => { + admin = await userFactory(testEnv, peterLustig) + await query({ + query: login, + variables: { email: 'peter@lustig.de', password: 'Aa12345_' }, + }) + + await userFactory(testEnv, bibiBloxberg) + await userFactory(testEnv, stephenHawking) + await userFactory(testEnv, garrickOllivander) + }) + + afterAll(async () => { + await cleanDB() + resetToken() + }) + + describe('without any filters', () => { + it('finds all users', async () => { + await expect( + query({ + query: searchUsers, + variables: { + ...variablesWithoutTextAndFilters, + }, + }), + ).resolves.toEqual( + expect.objectContaining({ + data: { + searchUsers: { + userCount: 4, + userList: expect.arrayContaining(convertObjValuesToArray(allUsers)), + }, + }, + }), + ) + }) + }) + + describe('all filters are null', () => { + it('finds all users', async () => { + await expect( + query({ + query: searchUsers, + variables: { + ...variablesWithoutTextAndFilters, + filters: { + filterByActivated: null, + filterByDeleted: null, + }, + }, + }), + ).resolves.toEqual( + expect.objectContaining({ + data: { + searchUsers: { + userCount: 4, + userList: expect.arrayContaining(convertObjValuesToArray(allUsers)), + }, + }, + }), + ) + }) + }) + + describe('filter by unchecked email', () => { + it('finds only users with unchecked email', async () => { + await expect( + query({ + query: searchUsers, + variables: { + ...variablesWithoutTextAndFilters, + filters: { + filterByActivated: false, + filterByDeleted: null, + }, + }, + }), + ).resolves.toEqual( + expect.objectContaining({ + data: { + searchUsers: { + userCount: 1, + userList: expect.arrayContaining([allUsers.garrick]), + }, + }, + }), + ) + }) + }) + + describe('filter by deleted users', () => { + it('finds only users with deleted account', async () => { + await expect( + query({ + query: searchUsers, + variables: { + ...variablesWithoutTextAndFilters, + filters: { + filterByActivated: null, + 
filterByDeleted: true, + }, + }, + }), + ).resolves.toEqual( + expect.objectContaining({ + data: { + searchUsers: { + userCount: 1, + userList: expect.arrayContaining([allUsers.stephen]), + }, + }, + }), + ) + }) + }) + + describe('filter by deleted account and unchecked email', () => { + it('finds no users', async () => { + await expect( + query({ + query: searchUsers, + variables: { + ...variablesWithoutTextAndFilters, + filters: { + filterByActivated: false, + filterByDeleted: true, + }, + }, + }), + ).resolves.toEqual( + expect.objectContaining({ + data: { + searchUsers: { + userCount: 0, + userList: [], + }, + }, + }), + ) + }) + }) + }) + }) + }) + describe('creations', () => { const variables = { email: 'bibi@bloxberg.de', diff --git a/backend/src/graphql/resolver/AdminResolver.ts b/backend/src/graphql/resolver/AdminResolver.ts index 8da92a61c..8c3d71b73 100644 --- a/backend/src/graphql/resolver/AdminResolver.ts +++ b/backend/src/graphql/resolver/AdminResolver.ts @@ -52,23 +52,19 @@ export class AdminResolver { @Query(() => SearchUsersResult) async searchUsers( @Args() - { - searchText, - currentPage = 1, - pageSize = 25, - filterByActivated = null, - filterByDeleted = null, - }: SearchUsersArgs, + { searchText, currentPage = 1, pageSize = 25, filters }: SearchUsersArgs, ): Promise<SearchUsersResult> { const userRepository = getCustomRepository(UserRepository) const filterCriteria: ObjectLiteral[] = [] - if (filterByActivated !== null) { - filterCriteria.push({ emailChecked: filterByActivated }) - } + if (filters) { + if (filters.filterByActivated !== null) { + filterCriteria.push({ emailChecked: filters.filterByActivated }) + } - if (filterByDeleted !== null) { - filterCriteria.push({ deletedAt: filterByDeleted ? Not(IsNull()) : IsNull() }) + if (filters.filterByDeleted !== null) { + filterCriteria.push({ deletedAt: filters.filterByDeleted ? 
Not(IsNull()) : IsNull() }) + } } const userFields = ['id', 'firstName', 'lastName', 'email', 'emailChecked', 'deletedAt'] @@ -442,11 +438,11 @@ export class AdminResolver { } = { userId, } - if (!filters.withRedeemed) where.redeemedBy = null - if (!filters.withExpired) where.validUntil = MoreThan(new Date()) + if (!filters.filterByRedeemed) where.redeemedBy = null + if (!filters.filterByExpired) where.validUntil = MoreThan(new Date()) const [transactionLinks, count] = await dbTransactionLink.findAndCount({ where, - withDeleted: filters.withDeleted, + withDeleted: filters.filterByDeleted, order: { createdAt: order, }, diff --git a/backend/src/graphql/resolver/BalanceResolver.ts b/backend/src/graphql/resolver/BalanceResolver.ts index 9df164960..176b45354 100644 --- a/backend/src/graphql/resolver/BalanceResolver.ts +++ b/backend/src/graphql/resolver/BalanceResolver.ts @@ -1,3 +1,5 @@ +import { backendLogger as logger } from '@/server/logger' + import { Context, getUser } from '@/server/context' import { Resolver, Query, Ctx, Authorized } from 'type-graphql' import { Balance } from '@model/Balance' @@ -18,15 +20,22 @@ export class BalanceResolver { const user = getUser(context) const now = new Date() + logger.addContext('user', user.id) + logger.info(`balance(userId=${user.id})...`) + const gdtResolver = new GdtResolver() const balanceGDT = await gdtResolver.gdtBalance(context) + logger.debug(`balanceGDT=${balanceGDT}`) const lastTransaction = context.lastTransaction ? context.lastTransaction : await dbTransaction.findOne({ userId: user.id }, { order: { balanceDate: 'DESC' } }) + logger.debug(`lastTransaction=${lastTransaction}`) + // No balance found if (!lastTransaction) { + logger.info(`no balance found, return Default-Balance!`) return new Balance({ balance: new Decimal(0), balanceGDT, @@ -39,6 +48,8 @@ export class BalanceResolver { context.transactionCount || context.transactionCount === 0 ? 
context.transactionCount : await dbTransaction.count({ where: { userId: user.id } }) + logger.debug(`transactionCount=${count}`) + const linkCount = await dbTransactionLink.count({ where: { userId: user.id, @@ -46,6 +57,7 @@ export class BalanceResolver { // validUntil: MoreThan(new Date()), }, }) + logger.debug(`linkCount=${linkCount}`) // The decay is always calculated on the last booked transaction const calculatedDecay = calculateDecay( @@ -53,6 +65,9 @@ export class BalanceResolver { lastTransaction.balanceDate, now, ) + logger.info( + `calculatedDecay(balance=${lastTransaction.balance}, balanceDate=${lastTransaction.balanceDate})=${calculatedDecay}`, + ) // The final balance is reduced by the link amount withheld const transactionLinkRepository = getCustomRepository(TransactionLinkRepository) @@ -60,13 +75,27 @@ export class BalanceResolver { ? { sumHoldAvailableAmount: context.sumHoldAvailableAmount } : await transactionLinkRepository.summary(user.id, now) - return new Balance({ - balance: calculatedDecay.balance - .minus(sumHoldAvailableAmount.toString()) - .toDecimalPlaces(2, Decimal.ROUND_DOWN), // round towards zero + logger.debug(`context.sumHoldAvailableAmount=${context.sumHoldAvailableAmount}`) + logger.debug(`sumHoldAvailableAmount=${sumHoldAvailableAmount}`) + + const balance = calculatedDecay.balance + .minus(sumHoldAvailableAmount.toString()) + .toDecimalPlaces(2, Decimal.ROUND_DOWN) // round towards zero + + // const newBalance = new Balance({ + // balance: calculatedDecay.balance + // .minus(sumHoldAvailableAmount.toString()) + // .toDecimalPlaces(2, Decimal.ROUND_DOWN), + const newBalance = new Balance({ + balance, balanceGDT, count, linkCount, }) + logger.info( + `new Balance(balance=${balance}, balanceGDT=${balanceGDT}, count=${count}, linkCount=${linkCount}) = ${newBalance}`, + ) + + return newBalance } } diff --git a/backend/src/graphql/resolver/TransactionResolver.ts b/backend/src/graphql/resolver/TransactionResolver.ts index 
69e1899d9..023e5b2ff 100644 --- a/backend/src/graphql/resolver/TransactionResolver.ts +++ b/backend/src/graphql/resolver/TransactionResolver.ts @@ -1,6 +1,7 @@ /* eslint-disable new-cap */ /* eslint-disable @typescript-eslint/no-non-null-assertion */ +import { backendLogger as logger } from '@/server/logger' import CONFIG from '@/config' import { Context, getUser } from '@/server/context' @@ -44,15 +45,22 @@ export const executeTransaction = async ( recipient: dbUser, transactionLink?: dbTransactionLink | null, ): Promise<boolean> => { + logger.info( + `executeTransaction(amount=${amount}, memo=${memo}, sender=${sender}, recipient=${recipient})...`, + ) + if (sender.id === recipient.id) { + logger.error(`Sender and Recipient are the same.`) throw new Error('Sender and Recipient are the same.') } if (memo.length > MEMO_MAX_CHARS) { + logger.error(`memo text is too long: memo.length=${memo.length} > ${MEMO_MAX_CHARS}`) throw new Error(`memo text is too long (${MEMO_MAX_CHARS} characters maximum)`) } if (memo.length < MEMO_MIN_CHARS) { + logger.error(`memo text is too short: memo.length=${memo.length} < ${MEMO_MIN_CHARS}`) throw new Error(`memo text is too short (${MEMO_MIN_CHARS} characters minimum)`) } @@ -64,13 +72,16 @@ export const executeTransaction = async ( receivedCallDate, transactionLink, ) + logger.debug(`calculated Balance=${sendBalance}`) if (!sendBalance) { + logger.error(`user hasn't enough GDD or amount is < 0 : balance=${sendBalance}`) throw new Error("user hasn't enough GDD or amount is < 0") } const queryRunner = getConnection().createQueryRunner() await queryRunner.connect() await queryRunner.startTransaction('READ UNCOMMITTED') + logger.debug(`open Transaction to write...`) try { // transaction const transactionSend = new dbTransaction() @@ -87,6 +98,8 @@ export const executeTransaction = async ( transactionSend.transactionLinkId = transactionLink ? 
transactionLink.id : null await queryRunner.manager.insert(dbTransaction, transactionSend) + logger.debug(`sendTransaction inserted: ${transactionSend}`) + const transactionReceive = new dbTransaction() transactionReceive.typeId = TransactionTypeId.RECEIVE transactionReceive.memo = memo @@ -102,12 +115,15 @@ export const executeTransaction = async ( transactionReceive.linkedTransactionId = transactionSend.id transactionReceive.transactionLinkId = transactionLink ? transactionLink.id : null await queryRunner.manager.insert(dbTransaction, transactionReceive) + logger.debug(`receive Transaction inserted: ${transactionReceive}`) // Save linked transaction id for send transactionSend.linkedTransactionId = transactionReceive.id await queryRunner.manager.update(dbTransaction, { id: transactionSend.id }, transactionSend) + logger.debug(`send Transaction updated: ${transactionSend}`) if (transactionLink) { + logger.info(`transactionLink: ${transactionLink}`) transactionLink.redeemedAt = receivedCallDate transactionLink.redeemedBy = recipient.id await queryRunner.manager.update( @@ -118,13 +134,15 @@ } await queryRunner.commitTransaction() + logger.info(`commit Transaction successful...`) } catch (e) { await queryRunner.rollbackTransaction() + logger.error(`Transaction was not successful: ${e}`) throw new Error(`Transaction was not successful: ${e}`) } finally { await queryRunner.release() } - + logger.debug(`prepare Email for transaction received...`) // send notification email // TODO: translate await sendTransactionReceivedEmail({ @@ -138,7 +156,7 @@ memo, overviewURL: CONFIG.EMAIL_LINK_OVERVIEW, }) - + logger.info(`finished executeTransaction successfully`) return true } @@ -154,16 +172,21 @@ export class TransactionResolver { const now = new Date() const user = getUser(context) + logger.addContext('user', user.id) + logger.info(`transactionList(user=${user.firstName}.${user.lastName},
${user.email})`) + // find current balance const lastTransaction = await dbTransaction.findOne( { userId: user.id }, { order: { balanceDate: 'DESC' } }, ) + logger.debug(`lastTransaction=${lastTransaction}`) const balanceResolver = new BalanceResolver() context.lastTransaction = lastTransaction if (!lastTransaction) { + logger.info('no lastTransaction') return new TransactionList(await balanceResolver.balance(context), []) } @@ -186,6 +209,8 @@ export class TransactionResolver { involvedUserIds.push(transaction.linkedUserId) } }) + logger.debug(`involvedUserIds=${involvedUserIds}`) + // We need to show the name for deleted users for old transactions const involvedDbUsers = await dbUser .createQueryBuilder() @@ -193,6 +218,7 @@ export class TransactionResolver { .where('id IN (:...userIds)', { userIds: involvedUserIds }) .getMany() const involvedUsers = involvedDbUsers.map((u) => new User(u)) + logger.debug(`involvedUsers=${involvedUsers}`) const self = new User(user) const transactions: Transaction[] = [] @@ -201,10 +227,13 @@ export class TransactionResolver { const { sumHoldAvailableAmount, sumAmount, lastDate, firstDate, transactionLinkcount } = await transactionLinkRepository.summary(user.id, now) context.linkCount = transactionLinkcount + logger.debug(`transactionLinkcount=${transactionLinkcount}`) context.sumHoldAvailableAmount = sumHoldAvailableAmount + logger.debug(`sumHoldAvailableAmount=${sumHoldAvailableAmount}`) // decay & link transactions if (currentPage === 1 && order === Order.DESC) { + logger.debug(`currentPage == 1: transactions=${transactions}`) // The virtual decay is always on the booked amount, not including the generated, not yet booked links, // since the decay is substantially different when the amount is less transactions.push( @@ -216,8 +245,11 @@ export class TransactionResolver { sumHoldAvailableAmount, ), ) + logger.debug(`transactions=${transactions}`) + // virtual transaction for pending transaction-links sum if 
(sumHoldAvailableAmount.greaterThan(0)) { + logger.debug(`sumHoldAvailableAmount > 0: transactions=${transactions}`) transactions.push( virtualLinkTransaction( lastTransaction.balance.minus(sumHoldAvailableAmount.toString()), @@ -229,6 +261,7 @@ export class TransactionResolver { self, ), ) + logger.debug(`transactions=${transactions}`) } } @@ -240,6 +273,7 @@ export class TransactionResolver { : involvedUsers.find((u) => u.id === userTransaction.linkedUserId) transactions.push(new Transaction(userTransaction, self, linkedUser)) }) + logger.debug(`TransactionTypeId.CREATION: transactions=${transactions}`) // Construct Result return new TransactionList(await balanceResolver.balance(context), transactions) @@ -251,29 +285,38 @@ export class TransactionResolver { @Args() { email, amount, memo }: TransactionSendArgs, @Ctx() context: Context, ): Promise { + logger.info(`sendCoins(email=${email}, amount=${amount}, memo=${memo})`) + // TODO this is subject to replay attacks const senderUser = getUser(context) if (senderUser.pubKey.length !== 32) { + logger.error(`invalid sender public key:${senderUser.pubKey}`) throw new Error('invalid sender public key') } // validate recipient user const recipientUser = await dbUser.findOne({ email: email }, { withDeleted: true }) if (!recipientUser) { + logger.error(`recipient not known: email=${email}`) throw new Error('recipient not known') } if (recipientUser.deletedAt) { + logger.error(`The recipient account was deleted: recipientUser=${recipientUser}`) throw new Error('The recipient account was deleted') } if (!recipientUser.emailChecked) { + logger.error(`The recipient account is not activated: recipientUser=${recipientUser}`) throw new Error('The recipient account is not activated') } if (!isHexPublicKey(recipientUser.pubKey.toString('hex'))) { + logger.error(`invalid recipient public key: recipientUser=${recipientUser}`) throw new Error('invalid recipient public key') } await executeTransaction(amount, memo, senderUser, 
recipientUser) - + logger.info( + `successful executeTransaction(amount=${amount}, memo=${memo}, senderUser=${senderUser}, recipientUser=${recipientUser})`, + ) return true } } diff --git a/backend/src/graphql/resolver/UserResolver.test.ts b/backend/src/graphql/resolver/UserResolver.test.ts index c658476a4..78b630834 100644 --- a/backend/src/graphql/resolver/UserResolver.test.ts +++ b/backend/src/graphql/resolver/UserResolver.test.ts @@ -14,6 +14,8 @@ import { sendAccountActivationEmail } from '@/mailer/sendAccountActivationEmail' import { sendResetPasswordEmail } from '@/mailer/sendResetPasswordEmail' import { printTimeDuration, activationLink } from './UserResolver' +import { logger } from '@test/testSetup' + // import { klicktippSignIn } from '@/apis/KlicktippController' jest.mock('@/mailer/sendAccountActivationEmail', () => { @@ -43,7 +45,7 @@ let mutate: any, query: any, con: any let testEnv: any beforeAll(async () => { - testEnv = await testEnvironment() + testEnv = await testEnvironment(logger) mutate = testEnv.mutate query = testEnv.query con = testEnv.con @@ -149,12 +151,14 @@ describe('UserResolver', () => { }) describe('email already exists', () => { - it('throws an error', async () => { - await expect(mutate({ mutation: createUser, variables })).resolves.toEqual( + it('throws and logs an error', async () => { + const mutation = await mutate({ mutation: createUser, variables }) + expect(mutation).toEqual( expect.objectContaining({ errors: [new GraphQLError('User already exists.')], }), ) + expect(logger.error).toBeCalledWith('User already exists with this email=peter@lustig.de') }) }) @@ -340,7 +344,6 @@ describe('UserResolver', () => { expect.objectContaining({ data: { login: { - coinanimation: true, email: 'bibi@bloxberg.de', firstName: 'Bibi', hasElopage: false, @@ -475,7 +478,6 @@ describe('UserResolver', () => { firstName: 'Bibi', lastName: 'Bloxberg', language: 'de', - coinanimation: true, klickTipp: { newsletterState: false, }, diff --git 
a/backend/src/graphql/resolver/UserResolver.ts b/backend/src/graphql/resolver/UserResolver.ts index 4ab5a901b..9b42d76b5 100644 --- a/backend/src/graphql/resolver/UserResolver.ts +++ b/backend/src/graphql/resolver/UserResolver.ts @@ -1,7 +1,9 @@ import fs from 'fs' +import { backendLogger as logger } from '@/server/logger' + import { Context, getUser } from '@/server/context' import { Resolver, Query, Args, Arg, Authorized, Ctx, UseMiddleware, Mutation } from 'type-graphql' -import { getConnection, getCustomRepository } from '@dbTools/typeorm' +import { getConnection } from '@dbTools/typeorm' import CONFIG from '@/config' import { User } from '@model/User' import { User as DbUser } from '@entity/User' @@ -11,8 +13,6 @@ import CreateUserArgs from '@arg/CreateUserArgs' import UnsecureLoginArgs from '@arg/UnsecureLoginArgs' import UpdateUserInfosArgs from '@arg/UpdateUserInfosArgs' import { klicktippNewsletterStateMiddleware } from '@/middleware/klicktippMiddleware' -import { UserSettingRepository } from '@repository/UserSettingRepository' -import { Setting } from '@enum/Setting' import { OptInType } from '@enum/OptInType' import { LoginEmailOptIn } from '@entity/LoginEmailOptIn' import { sendResetPasswordEmail as sendResetPasswordEmailMailer } from '@/mailer/sendResetPasswordEmail' @@ -43,6 +43,7 @@ const WORDS = fs .toString() .split(',') const PassphraseGenerate = (): string[] => { + logger.trace('PassphraseGenerate...') const result = [] for (let i = 0; i < PHRASE_WORD_COUNT; i++) { result.push(WORDS[sodium.randombytes_random() % 2048]) @@ -51,7 +52,9 @@ const PassphraseGenerate = (): string[] => { } const KeyPairEd25519Create = (passphrase: string[]): Buffer[] => { + logger.trace('KeyPairEd25519Create...') if (!passphrase.length || passphrase.length < PHRASE_WORD_COUNT) { + logger.error('passphrase empty or to short') throw new Error('passphrase empty or to short') } @@ -79,14 +82,19 @@ const KeyPairEd25519Create = (passphrase: string[]): Buffer[] => { privKey, 
outputHashBuffer.slice(0, sodium.crypto_sign_SEEDBYTES), ) + logger.debug(`KeyPair creation ready. pubKey=${pubKey}`) return [pubKey, privKey] } const SecretKeyCryptographyCreateKey = (salt: string, password: string): Buffer[] => { + logger.trace('SecretKeyCryptographyCreateKey...') const configLoginAppSecret = Buffer.from(CONFIG.LOGIN_APP_SECRET, 'hex') const configLoginServerKey = Buffer.from(CONFIG.LOGIN_SERVER_KEY, 'hex') if (configLoginServerKey.length !== sodium.crypto_shorthash_KEYBYTES) { + logger.error( + `ServerKey has an invalid size. The size must be ${sodium.crypto_shorthash_KEYBYTES} bytes.`, + ) throw new Error( `ServerKey has an invalid size. The size must be ${sodium.crypto_shorthash_KEYBYTES} bytes.`, ) @@ -115,39 +123,50 @@ const SecretKeyCryptographyCreateKey = (salt: string, password: string): Buffer[ const encryptionKeyHash = Buffer.alloc(sodium.crypto_shorthash_BYTES) sodium.crypto_shorthash(encryptionKeyHash, encryptionKey, configLoginServerKey) + logger.debug( + `SecretKeyCryptographyCreateKey...successful: encryptionKeyHash= ${encryptionKeyHash}, encryptionKey= ${encryptionKey}`, + ) return [encryptionKeyHash, encryptionKey] } const getEmailHash = (email: string): Buffer => { + logger.trace('getEmailHash...') const emailHash = Buffer.alloc(sodium.crypto_generichash_BYTES) sodium.crypto_generichash(emailHash, Buffer.from(email)) + logger.debug(`getEmailHash...successful: ${emailHash}`) return emailHash } const SecretKeyCryptographyEncrypt = (message: Buffer, encryptionKey: Buffer): Buffer => { + logger.trace('SecretKeyCryptographyEncrypt...') const encrypted = Buffer.alloc(message.length + sodium.crypto_secretbox_MACBYTES) const nonce = Buffer.alloc(sodium.crypto_secretbox_NONCEBYTES) nonce.fill(31) // static nonce sodium.crypto_secretbox_easy(encrypted, message, nonce, encryptionKey) + logger.debug(`SecretKeyCryptographyEncrypt...successful: ${encrypted}`) return encrypted } const SecretKeyCryptographyDecrypt = (encryptedMessage: Buffer, 
encryptionKey: Buffer): Buffer => { + logger.trace('SecretKeyCryptographyDecrypt...') const message = Buffer.alloc(encryptedMessage.length - sodium.crypto_secretbox_MACBYTES) const nonce = Buffer.alloc(sodium.crypto_secretbox_NONCEBYTES) nonce.fill(31) // static nonce sodium.crypto_secretbox_open_easy(message, encryptedMessage, nonce, encryptionKey) + logger.debug(`SecretKeyCryptographyDecrypt...successful: ${message}`) return message } const newEmailOptIn = (userId: number): LoginEmailOptIn => { + logger.trace('newEmailOptIn...') const emailOptIn = new LoginEmailOptIn() emailOptIn.verificationCode = random(64) emailOptIn.userId = userId emailOptIn.emailOptInTypeId = OptInType.EMAIL_OPT_IN_REGISTER + logger.debug(`newEmailOptIn...successful: ${emailOptIn}`) return emailOptIn } @@ -159,8 +178,14 @@ export const checkOptInCode = async ( userId: number, optInType: OptInType = OptInType.EMAIL_OPT_IN_REGISTER, ): Promise => { + logger.info(`checkOptInCode... ${optInCode}`) if (optInCode) { if (!canResendOptIn(optInCode)) { + logger.error( + `email already sent less than ${printTimeDuration( + CONFIG.EMAIL_CODE_REQUEST_TIME, + )} minutes ago`, + ) throw new Error( `email already sent less than ${printTimeDuration( CONFIG.EMAIL_CODE_REQUEST_TIME, @@ -170,16 +195,20 @@ export const checkOptInCode = async ( optInCode.updatedAt = new Date() optInCode.resendCount++ } else { + logger.trace('create new OptIn for userId=' + userId) optInCode = newEmailOptIn(userId) } optInCode.emailOptInTypeId = optInType await LoginEmailOptIn.save(optInCode).catch(() => { + logger.error('Unable to save optin code= ' + optInCode) throw new Error('Unable to save optin code.') }) + logger.debug(`checkOptInCode...successful: ${optInCode} for userId=${userId}`) return optInCode } export const activationLink = (optInCode: LoginEmailOptIn): string => { + logger.debug(`activationLink(${optInCode})...`) return CONFIG.EMAIL_LINK_SETPASSWORD.replace(/{optin}/g, optInCode.verificationCode.toString())
} @@ -189,6 +218,7 @@ export class UserResolver { @Query(() => User) @UseMiddleware(klicktippNewsletterStateMiddleware) async verifyLogin(@Ctx() context: Context): Promise { + logger.info('verifyLogin...') // TODO refactor and do not have duplicate code with login(see below) const userEntity = getUser(context) const user = new User(userEntity) @@ -196,15 +226,7 @@ export class UserResolver { // Elopage Status & Stored PublisherId user.hasElopage = await this.hasElopage(context) - // coinAnimation - const userSettingRepository = getCustomRepository(UserSettingRepository) - const coinanimation = await userSettingRepository - .readBoolean(userEntity.id, Setting.COIN_ANIMATION) - .catch((error) => { - throw new Error(error) - }) - user.coinanimation = coinanimation - + logger.debug(`verifyLogin... successful: ${user.firstName}.${user.lastName}, ${user.email}`) return user } @@ -215,54 +237,57 @@ export class UserResolver { @Args() { email, password, publisherId }: UnsecureLoginArgs, @Ctx() context: Context, ): Promise { + logger.info(`login with ${email}, ***, ${publisherId} ...`) email = email.trim().toLowerCase() const dbUser = await DbUser.findOneOrFail({ email }, { withDeleted: true }).catch(() => { + logger.error(`User with email=${email} does not exists`) throw new Error('No user with this credentials') }) if (dbUser.deletedAt) { + logger.error('The User was permanently deleted in database.') throw new Error('This user was permanently deleted. 
Contact support for questions.') } if (!dbUser.emailChecked) { + logger.error('The user\'s email is not validated yet.') throw new Error('User email not validated') } if (dbUser.password === BigInt(0)) { + logger.error('The User has not set a password yet.') // TODO we want to catch this on the frontend and ask the user to check his emails or resend code throw new Error('User has no password set yet') } if (!dbUser.pubKey || !dbUser.privKey) { + logger.error('The User has no private or publicKey.') // TODO we want to catch this on the frontend and ask the user to check his emails or resend code throw new Error('User has no private or publicKey') } const passwordHash = SecretKeyCryptographyCreateKey(email, password) // return short and long hash const loginUserPassword = BigInt(dbUser.password.toString()) if (loginUserPassword !== passwordHash[0].readBigUInt64LE()) { + logger.error('The User has no valid credentials.') throw new Error('No user with this credentials') } + // add pubKey in logger-context for layout-pattern X{user} to print it in each logging message + logger.addContext('user', dbUser.id) + logger.debug('login credentials valid...') const user = new User(dbUser) + logger.debug('user=' + user) // Elopage Status & Stored PublisherId user.hasElopage = await this.hasElopage({ ...context, user: dbUser }) + logger.info('user.hasElopage=' + user.hasElopage) if (!user.hasElopage && publisherId) { user.publisherId = publisherId dbUser.publisherId = publisherId DbUser.save(dbUser) } - // coinAnimation - const userSettingRepository = getCustomRepository(UserSettingRepository) - const coinanimation = await userSettingRepository - .readBoolean(dbUser.id, Setting.COIN_ANIMATION) - .catch((error) => { - throw new Error(error) - }) - user.coinanimation = coinanimation - context.setHeaders.push({ key: 'token', value: encode(dbUser.pubKey), }) - + logger.info('successful Login: ' + user) return user } @@ -274,6 +299,9 @@ export class UserResolver { // The functionality is
fully client side - the client just needs to delete his token with the current implementation. // we could try to force this by sending `token: null` or `token: ''` with this call. But since it bares no real security // we should just return true for now. + logger.info('Logout...') + // remove user.pubKey from logger-context to ensure a correct filter on log-messages belonging to the same user + logger.addContext('user', 'unknown') return true } @@ -283,6 +311,9 @@ export class UserResolver { @Args() { email, firstName, lastName, language, publisherId, redeemCode = null }: CreateUserArgs, ): Promise { + logger.info( + `createUser(email=${email}, firstName=${firstName}, lastName=${lastName}, language=${language}, publisherId=${publisherId}, redeemCode =${redeemCode})`, + ) // TODO: wrong default value (should be null), how does graphql work here? Is it an required field? // default int publisher_id = 0; @@ -295,7 +326,9 @@ export class UserResolver { email = email.trim().toLowerCase() // TODO we cannot use repository.count(), since it does not allow to specify if you want to include the soft deletes const userFound = await DbUser.findOne({ email }, { withDeleted: true }) + logger.info(`DbUser.findOne(email=${email}) = ${userFound}`) if (userFound) { + logger.error('User already exists with this email=' + email) // TODO: this is unsecure, but the current implementation of the login server. This way it can be queried if the user with given EMail is existent. 
throw new Error(`User already exists.`) } @@ -314,8 +347,10 @@ export class UserResolver { dbUser.language = language dbUser.publisherId = publisherId dbUser.passphrase = passphrase.join(' ') + logger.debug('new dbUser=' + dbUser) if (redeemCode) { const transactionLink = await dbTransactionLink.findOne({ code: redeemCode }) + logger.info('redeemCode found transactionLink=' + transactionLink) if (transactionLink) { dbUser.referrerId = transactionLink.userId } @@ -332,15 +367,13 @@ export class UserResolver { await queryRunner.startTransaction('READ UNCOMMITTED') try { await queryRunner.manager.save(dbUser).catch((error) => { - // eslint-disable-next-line no-console - console.log('Error while saving dbUser', error) + logger.error('Error while saving dbUser', error) throw new Error('error saving user') }) const emailOptIn = newEmailOptIn(dbUser.id) await queryRunner.manager.save(emailOptIn).catch((error) => { - // eslint-disable-next-line no-console - console.log('Error while saving emailOptIn', error) + logger.error('Error while saving emailOptIn', error) throw new Error('error saving email opt in') }) @@ -357,31 +390,35 @@ export class UserResolver { email, duration: printTimeDuration(CONFIG.EMAIL_CODE_VALID_TIME), }) - - /* uncomment this, when you need the activation link on the console + logger.info(`sendAccountActivationEmail of ${firstName}.${lastName} to ${email}`) + /* uncomment this, when you need the activation link on the console */ // In case EMails are disabled log the activation link for the user if (!emailSent) { - // eslint-disable-next-line no-console - console.log(`Account confirmation link: ${activationLink}`) + logger.debug(`Account confirmation link: ${activationLink}`) } - */ await queryRunner.commitTransaction() } catch (e) { + logger.error(`error during create user with ${e}`) await queryRunner.rollbackTransaction() throw e } finally { await queryRunner.release() } + logger.info('createUser() successful...') return new User(dbUser) } 
@Authorized([RIGHTS.SEND_RESET_PASSWORD_EMAIL]) @Mutation(() => Boolean) async forgotPassword(@Arg('email') email: string): Promise { + logger.info(`forgotPassword(${email})...`) email = email.trim().toLowerCase() const user = await DbUser.findOne({ email }) - if (!user) return true + if (!user) { + logger.warn(`no user found with ${email}`) + return true + } // can be both types: REGISTER and RESET_PASSWORD let optInCode = await LoginEmailOptIn.findOne({ @@ -389,7 +426,7 @@ export class UserResolver { }) optInCode = await checkOptInCode(optInCode, user.id, OptInType.EMAIL_OPT_IN_RESET_PASSWORD) - + logger.info(`optInCode for ${email}=${optInCode}`) // eslint-disable-next-line @typescript-eslint/no-unused-vars const emailSent = await sendResetPasswordEmailMailer({ link: activationLink(optInCode), @@ -399,13 +436,12 @@ export class UserResolver { duration: printTimeDuration(CONFIG.EMAIL_CODE_VALID_TIME), }) - /* uncomment this, when you need the activation link on the console + /* uncomment this, when you need the activation link on the console */ // In case EMails are disabled log the activation link for the user if (!emailSent) { - // eslint-disable-next-line no-console - console.log(`Reset password link: ${link}`) + logger.debug(`Reset password link: ${activationLink(optInCode)}`) } - */ + logger.info(`forgotPassword(${email}) successful...`) return true } @@ -416,6 +452,7 @@ export class UserResolver { @Arg('code') code: string, @Arg('password') password: string, ): Promise { + logger.info(`setPassword(${code}, ***)...`) // Validate Password if (!isPassword(password)) { throw new Error( @@ -425,34 +462,44 @@ export class UserResolver { // Load code const optInCode = await LoginEmailOptIn.findOneOrFail({ verificationCode: code }).catch(() => { + logger.error('Could not login with emailVerificationCode') throw new Error('Could not login with emailVerificationCode') }) - + logger.debug('optInCode loaded...') // Code is only valid for `CONFIG.EMAIL_CODE_VALID_TIME` 
minutes if (!isOptInValid(optInCode)) { + logger.error( + `email was sent more than ${printTimeDuration(CONFIG.EMAIL_CODE_VALID_TIME)} ago`, + ) throw new Error( `email was sent more than ${printTimeDuration(CONFIG.EMAIL_CODE_VALID_TIME)} ago`, ) } + logger.debug('optInCode is valid...') // load user const user = await DbUser.findOneOrFail({ id: optInCode.userId }).catch(() => { + logger.error('Could not find corresponding Login User') throw new Error('Could not find corresponding Login User') }) + logger.debug('user with optInCode found...') // Generate Passphrase if needed if (!user.passphrase) { const passphrase = PassphraseGenerate() user.passphrase = passphrase.join(' ') + logger.debug('new Passphrase generated...') } const passphrase = user.passphrase.split(' ') if (passphrase.length < PHRASE_WORD_COUNT) { + logger.error('Could not load a correct passphrase') // TODO if this can happen we cannot recover from that // this seem to be good on production data, if we dont // make a coding mistake we do not have a problem here throw new Error('Could not load a correct passphrase') } + logger.debug('Passphrase is valid...') // Activate EMail user.emailChecked = true @@ -464,6 +511,7 @@ export class UserResolver { user.password = passwordHash[0].readBigUInt64LE() // using the shorthash user.pubKey = keyPair[0] user.privKey = encryptedPrivkey + logger.debug('User credentials updated ...') const queryRunner = getConnection().createQueryRunner() await queryRunner.connect() @@ -472,12 +520,15 @@ export class UserResolver { try { // Save user await queryRunner.manager.save(user).catch((error) => { + logger.error('error saving user: ' + error) throw new Error('error saving user: ' + error) }) await queryRunner.commitTransaction() + logger.info('User data written successfully...') } catch (e) { await queryRunner.rollbackTransaction() + logger.error('Error on writing User data:' + e) throw e } finally { await queryRunner.release() @@ -488,7 +539,11 @@ export class 
UserResolver { if (optInCode.emailOptInTypeId === OptInType.EMAIL_OPT_IN_REGISTER) { try { await klicktippSignIn(user.email, user.language, user.firstName, user.lastName) - } catch { + logger.debug( + `klicktippSignIn(${user.email}, ${user.language}, ${user.firstName}, ${user.lastName})`, + ) + } catch (e) { + logger.error('Error subscribe to klicktipp:' + e) // TODO is this a problem? // eslint-disable-next-line no-console /* uncomment this, when you need the activation link on the console @@ -503,13 +558,19 @@ export class UserResolver { @Authorized([RIGHTS.QUERY_OPT_IN]) @Query(() => Boolean) async queryOptIn(@Arg('optIn') optIn: string): Promise { + logger.info(`queryOptIn(${optIn})...`) const optInCode = await LoginEmailOptIn.findOneOrFail({ verificationCode: optIn }) + logger.debug(`found optInCode=${optInCode}`) // Code is only valid for `CONFIG.EMAIL_CODE_VALID_TIME` minutes if (!isOptInValid(optInCode)) { + logger.error( + `email was sent more than ${printTimeDuration(CONFIG.EMAIL_CODE_VALID_TIME)} ago`, + ) throw new Error( `email was sent more than ${printTimeDuration(CONFIG.EMAIL_CODE_VALID_TIME)} ago`, ) } + logger.info(`queryOptIn(${optIn}) successful...`) return true } @@ -517,9 +578,10 @@ export class UserResolver { @Mutation(() => Boolean) async updateUserInfos( @Args() - { firstName, lastName, language, password, passwordNew, coinanimation }: UpdateUserInfosArgs, + { firstName, lastName, language, password, passwordNew }: UpdateUserInfosArgs, @Ctx() context: Context, ): Promise { + logger.info(`updateUserInfos(${firstName}, ${lastName}, ${language}, ***, ***)...`) const userEntity = getUser(context) if (firstName) { @@ -532,6 +594,7 @@ export class UserResolver { if (language) { if (!isLanguage(language)) { + logger.error(`"${language}" isn't a valid language`) throw new Error(`"${language}" isn't a valid language`) } userEntity.language = language @@ -540,6 +603,7 @@ export class UserResolver { if (password && passwordNew) { // Validate Password 
if (!isPassword(passwordNew)) { + logger.error('newPassword does not fullfil the rules') throw new Error( 'Please enter a valid password with at least 8 characters, upper and lower case letters, at least one number and one special character!', ) @@ -548,13 +612,16 @@ export class UserResolver { // TODO: This had some error cases defined - like missing private key. This is no longer checked. const oldPasswordHash = SecretKeyCryptographyCreateKey(userEntity.email, password) if (BigInt(userEntity.password.toString()) !== oldPasswordHash[0].readBigUInt64LE()) { + logger.error(`Old password is invalid`) throw new Error(`Old password is invalid`) } const privKey = SecretKeyCryptographyDecrypt(userEntity.privKey, oldPasswordHash[1]) - + logger.debug('oldPassword decrypted...') const newPasswordHash = SecretKeyCryptographyCreateKey(userEntity.email, passwordNew) // return short and long hash + logger.debug('newPasswordHash created...') const encryptedPrivkey = SecretKeyCryptographyEncrypt(privKey, newPasswordHash[1]) + logger.debug('PrivateKey encrypted...') // Save new password hash and newly encrypted private key userEntity.password = newPasswordHash[0].readBigUInt64LE() @@ -566,39 +633,35 @@ export class UserResolver { await queryRunner.startTransaction('READ UNCOMMITTED') try { - if (coinanimation !== null && coinanimation !== undefined) { - queryRunner.manager - .getCustomRepository(UserSettingRepository) - .setOrUpdate(userEntity.id, Setting.COIN_ANIMATION, coinanimation.toString()) - .catch((error) => { - throw new Error('error saving coinanimation: ' + error) - }) - } - await queryRunner.manager.save(userEntity).catch((error) => { throw new Error('error saving user: ' + error) }) await queryRunner.commitTransaction() + logger.debug('writing User data successful...') } catch (e) { await queryRunner.rollbackTransaction() + logger.error(`error on writing updated user data: ${e}`) throw e } finally { await queryRunner.release() } - + logger.info('updateUserInfos() 
successfully finished...') return true } @Authorized([RIGHTS.HAS_ELOPAGE]) @Query(() => Boolean) async hasElopage(@Ctx() context: Context): Promise { + logger.info(`hasElopage()...`) const userEntity = context.user if (!userEntity) { + logger.info('missing context.user for EloPage-check') return false } - - return hasElopageBuys(userEntity.email) + const elopageBuys = hasElopageBuys(userEntity.email) + logger.debug(`has ElopageBuys = ${elopageBuys}`) + return elopageBuys } } diff --git a/backend/src/mailer/sendEMail.test.ts b/backend/src/mailer/sendEMail.test.ts index b7cc06a60..8a13c027d 100644 --- a/backend/src/mailer/sendEMail.test.ts +++ b/backend/src/mailer/sendEMail.test.ts @@ -2,6 +2,8 @@ import { sendEMail } from './sendEMail' import { createTransport } from 'nodemailer' import CONFIG from '@/config' +import { logger } from '@test/testSetup' + CONFIG.EMAIL = false CONFIG.EMAIL_SMTP_URL = 'EMAIL_SMTP_URL' CONFIG.EMAIL_SMTP_PORT = '1234' @@ -26,11 +28,6 @@ jest.mock('nodemailer', () => { describe('sendEMail', () => { let result: boolean describe('config email is false', () => { - // eslint-disable-next-line no-console - const consoleLog = console.log - const consoleLogMock = jest.fn() - // eslint-disable-next-line no-console - console.log = consoleLogMock beforeEach(async () => { result = await sendEMail({ to: 'receiver@mail.org', @@ -39,13 +36,8 @@ }) }) - afterAll(() => { - // eslint-disable-next-line no-console - console.log = consoleLog - }) - - it('logs warining to console', () => { - expect(consoleLogMock).toBeCalledWith('Emails are disabled via config') + it('logs warning', () => { + expect(logger.info).toBeCalledWith('Emails are disabled via config...') }) it('returns false', () => { diff --git a/backend/src/mailer/sendEMail.ts b/backend/src/mailer/sendEMail.ts index 13c28996b..640dd7f4c 100644 --- a/backend/src/mailer/sendEMail.ts +++ b/backend/src/mailer/sendEMail.ts @@ -1,3 +1,4 @@ +import { backendLogger as logger }
from '@/server/logger' import { createTransport } from 'nodemailer' import CONFIG from '@/config' @@ -7,9 +8,10 @@ export const sendEMail = async (emailDef: { subject: string text: string }): Promise => { + logger.info(`send Email: to=${emailDef.to}, subject=${emailDef.subject}, text=${emailDef.text}`) + if (!CONFIG.EMAIL) { - // eslint-disable-next-line no-console - console.log('Emails are disabled via config') + logger.info(`Emails are disabled via config...`) return false } const transporter = createTransport({ @@ -27,7 +29,9 @@ export const sendEMail = async (emailDef: { from: `Gradido (nicht antworten) <${CONFIG.EMAIL_SENDER}>`, }) if (!info.messageId) { + logger.error('error sending notification email, but transaction succeed') throw new Error('error sending notification email, but transaction succeed') } + logger.info('send Email successfully.') return true } diff --git a/backend/src/mailer/sendTransactionReceivedEmail.ts b/backend/src/mailer/sendTransactionReceivedEmail.ts index 537c13d85..692f92f9a 100644 --- a/backend/src/mailer/sendTransactionReceivedEmail.ts +++ b/backend/src/mailer/sendTransactionReceivedEmail.ts @@ -1,3 +1,4 @@ +import { backendLogger as logger } from '@/server/logger' import Decimal from 'decimal.js-light' import { sendEMail } from './sendEMail' import { transactionReceived } from './text/transactionReceived' @@ -13,6 +14,12 @@ export const sendTransactionReceivedEmail = (data: { memo: string overviewURL: string }): Promise => { + logger.info( + `sendEmail(): to=${data.recipientFirstName} ${data.recipientLastName}, + <${data.email}>, + subject=${transactionReceived.de.subject}, + text=${transactionReceived.de.text(data)}`, + ) return sendEMail({ to: `${data.recipientFirstName} ${data.recipientLastName} <${data.email}>`, subject: transactionReceived.de.subject, diff --git a/backend/src/seeds/graphql/mutations.ts b/backend/src/seeds/graphql/mutations.ts index d3026dbdd..6e1fe9174 100644 --- a/backend/src/seeds/graphql/mutations.ts +++ 
b/backend/src/seeds/graphql/mutations.ts @@ -31,7 +31,6 @@ export const updateUserInfos = gql` $password: String $passwordNew: String $locale: String - $coinanimation: Boolean ) { updateUserInfos( firstName: $firstName @@ -39,7 +38,6 @@ export const updateUserInfos = gql` password: $password passwordNew: $passwordNew language: $locale - coinanimation: $coinanimation ) } ` @@ -107,6 +105,35 @@ export const unDeleteUser = gql` } ` +export const searchUsers = gql` + query ( + $searchText: String! + $currentPage: Int + $pageSize: Int + $filters: SearchUsersFiltersInput + ) { + searchUsers( + searchText: $searchText + currentPage: $currentPage + pageSize: $pageSize + filters: $filters + ) { + userCount + userList { + userId + firstName + lastName + email + creation + emailChecked + hasElopage + emailConfirmationSend + deletedAt + } + } + } +` + export const createPendingCreations = gql` mutation ($pendingCreations: [CreatePendingCreationArgs!]!) { createPendingCreations(pendingCreations: $pendingCreations) { diff --git a/backend/src/seeds/graphql/queries.ts b/backend/src/seeds/graphql/queries.ts index 82067c968..16b2b71ae 100644 --- a/backend/src/seeds/graphql/queries.ts +++ b/backend/src/seeds/graphql/queries.ts @@ -8,7 +8,6 @@ export const login = gql` firstName lastName language - coinanimation klickTipp { newsletterState } @@ -26,7 +25,6 @@ export const verifyLogin = gql` firstName lastName language - coinanimation klickTipp { newsletterState } diff --git a/backend/src/seeds/index.ts b/backend/src/seeds/index.ts index 21539e1ba..710f255ee 100644 --- a/backend/src/seeds/index.ts +++ b/backend/src/seeds/index.ts @@ -29,7 +29,7 @@ const context = { } export const cleanDB = async () => { - // this only works as lond we do not have foreign key constraints + // this only works as long as we do not have foreign key constraints for (let i = 0; i < entities.length; i++) { await resetEntity(entities[i]) } diff --git a/backend/src/server/createServer.ts
b/backend/src/server/createServer.ts index 8315fda58..a0b294281 100644 --- a/backend/src/server/createServer.ts +++ b/backend/src/server/createServer.ts @@ -22,22 +22,32 @@ import schema from '@/graphql/schema' import { elopageWebhook } from '@/webhook/elopage' import { Connection } from '@dbTools/typeorm' +import { apolloLogger } from './logger' +import { Logger } from 'log4js' + // TODO implement // import queryComplexity, { simpleEstimator, fieldConfigEstimator } from "graphql-query-complexity"; type ServerDef = { apollo: ApolloServer; app: Express; con: Connection } -// eslint-disable-next-line @typescript-eslint/no-explicit-any -const createServer = async (context: any = serverContext): Promise<ServerDef> => { +const createServer = async ( + // eslint-disable-next-line @typescript-eslint/no-explicit-any + context: any = serverContext, + logger: Logger = apolloLogger, +): Promise<ServerDef> => { + logger.debug('createServer...') + // open mysql connection const con = await connection() if (!con || !con.isConnected) { + logger.fatal(`Couldn't open connection to database!`) throw new Error(`Fatal: Couldn't open connection to database`) } // check for correct database version const dbVersion = await checkDBVersion(CONFIG.DB_VERSION) if (!dbVersion) { + logger.fatal('Fatal: Database Version incorrect') throw new Error('Fatal: Database Version incorrect') } @@ -62,8 +72,10 @@ const createServer = async (context: any = serverContext): Promise<ServerDef> => introspection: CONFIG.GRAPHIQL, context, plugins, + logger, }) apollo.applyMiddleware({ app, path: '/' }) + logger.debug('createServer...successful') return { apollo, app, con } } diff --git a/backend/src/server/logger.ts b/backend/src/server/logger.ts new file mode 100644 index 000000000..27d0cf75b --- /dev/null +++ b/backend/src/server/logger.ts @@ -0,0 +1,17 @@ +import log4js from 'log4js' +import CONFIG from '@/config' + +import { readFileSync } from 'fs' + +const options = JSON.parse(readFileSync(CONFIG.LOG4JS_CONFIG, 'utf-8')) +
+options.categories.default.level = CONFIG.LOG_LEVEL + +log4js.configure(options) + +const apolloLogger = log4js.getLogger('apollo') +const backendLogger = log4js.getLogger('backend') + +backendLogger.addContext('user', 'unknown') + +export { apolloLogger, backendLogger } diff --git a/backend/src/server/plugins.ts b/backend/src/server/plugins.ts index a407135ea..134ca1bb9 100644 --- a/backend/src/server/plugins.ts +++ b/backend/src/server/plugins.ts @@ -1,8 +1,7 @@ /* eslint-disable @typescript-eslint/no-explicit-any */ /* eslint-disable @typescript-eslint/explicit-module-boundary-types */ -import { ApolloLogPlugin, LogMutateData } from 'apollo-log' -import cloneDeep from 'lodash.clonedeep' +import clonedeep from 'lodash.clonedeep' const setHeadersPlugin = { requestDidStart() { @@ -22,24 +21,35 @@ const setHeadersPlugin = { }, } -const apolloLogPlugin = ApolloLogPlugin({ - mutate: (data: LogMutateData) => { - // We need to deep clone the object in order to not modify the actual request - const dataCopy = cloneDeep(data) +const filterVariables = (variables: any) => { + const vars = clonedeep(variables) + if (vars.password) vars.password = '***' + if (vars.passwordNew) vars.passwordNew = '***' + return vars +} - // mask password if part of the query - if (dataCopy.context.request.variables && dataCopy.context.request.variables.password) { - dataCopy.context.request.variables.password = '***' +const logPlugin = { + requestDidStart(requestContext: any) { + const { logger } = requestContext + const { query, mutation, variables } = requestContext.request + logger.info(`Request: +${mutation || query}variables: ${JSON.stringify(filterVariables(variables), null, 2)}`) + return { + willSendResponse(requestContext: any) { + if (requestContext.context.user) logger.info(`User ID: ${requestContext.context.user.id}`) + if (requestContext.response.data) + logger.info(`Response-Data: +${JSON.stringify(requestContext.response.data, null, 2)}`) + if (requestContext.response.errors) + 
logger.error(`Response-Errors: +${JSON.stringify(requestContext.response.errors, null, 2)}`) + return requestContext + }, } - - // mask token at all times - dataCopy.context.context.token = '***' - - return dataCopy }, -}) +} const plugins = - process.env.NODE_ENV === 'development' ? [setHeadersPlugin] : [setHeadersPlugin, apolloLogPlugin] + process.env.NODE_ENV === 'development' ? [setHeadersPlugin] : [setHeadersPlugin, logPlugin] export default plugins diff --git a/backend/src/typeorm/DBVersion.ts b/backend/src/typeorm/DBVersion.ts index a8cb70489..cb53c49f1 100644 --- a/backend/src/typeorm/DBVersion.ts +++ b/backend/src/typeorm/DBVersion.ts @@ -1,12 +1,12 @@ import { Migration } from '@entity/Migration' +import { backendLogger as logger } from '@/server/logger' const getDBVersion = async (): Promise<string | null> => { try { const dbVersion = await Migration.findOne({ order: { version: 'DESC' } }) return dbVersion ? dbVersion.fileName : null } catch (error) { - // eslint-disable-next-line no-console - console.log(error) + logger.error(error) return null } } @@ -14,8 +14,7 @@ const getDBVersion = async (): Promise<string | null> => { const checkDBVersion = async (DB_VERSION: string): Promise<boolean> => { const dbVersion = await getDBVersion() if (!dbVersion || dbVersion.indexOf(DB_VERSION) === -1) { - // eslint-disable-next-line no-console - console.log( + logger.error( `Wrong database version detected - the backend requires '${DB_VERSION}' but found '${ dbVersion || 'None' }`, diff --git a/backend/src/typeorm/connection.ts b/backend/src/typeorm/connection.ts index 745b2da94..d08d935d4 100644 --- a/backend/src/typeorm/connection.ts +++ b/backend/src/typeorm/connection.ts @@ -20,6 +20,9 @@ const connection = async (): Promise<Connection | null> => { logger: new FileLogger('all', { logPath: CONFIG.TYPEORM_LOGGING_RELATIVE_PATH, }), + extra: { + charset: 'utf8mb4_unicode_ci', + }, }) } catch (error) { // eslint-disable-next-line no-console diff --git a/backend/src/typeorm/repository/UserSettingRepository.ts
b/backend/src/typeorm/repository/UserSettingRepository.ts index 528090ff2..f911cfd1a 100644 --- a/backend/src/typeorm/repository/UserSettingRepository.ts +++ b/backend/src/typeorm/repository/UserSettingRepository.ts @@ -1,33 +1,22 @@ import { EntityRepository, Repository } from '@dbTools/typeorm' import { UserSetting } from '@entity/UserSetting' -import { Setting } from '@enum/Setting' import { isStringBoolean } from '@/util/validate' @EntityRepository(UserSetting) export class UserSettingRepository extends Repository<UserSetting> { - async setOrUpdate(userId: number, key: Setting, value: string): Promise<UserSetting> { - switch (key) { - case Setting.COIN_ANIMATION: - if (!isStringBoolean(value)) { - throw new Error("coinanimation value isn't boolean") - } - break - default: - throw new Error("key isn't defined: " + key) - } - let entity = await this.findOne({ userId: userId, key: key }) + async setOrUpdate(userId: number, value: string): Promise<UserSetting> { + let entity = await this.findOne({ userId: userId }) if (!entity) { entity = new UserSetting() entity.userId = userId - entity.key = key } entity.value = value return this.save(entity) } - async readBoolean(userId: number, key: Setting): Promise<boolean> { - const entity = await this.findOne({ userId: userId, key: key }) + async readBoolean(userId: number): Promise<boolean> { + const entity = await this.findOne({ userId: userId }) if (!entity || !isStringBoolean(entity.value)) { return true } diff --git a/backend/src/util/utilities.ts b/backend/src/util/utilities.ts new file mode 100644 index 000000000..f77ad05ec --- /dev/null +++ b/backend/src/util/utilities.ts @@ -0,0 +1,5 @@ +export const convertObjValuesToArray = (obj: { [x: string]: string }): Array<string> => { + return Object.keys(obj).map(function (key) { + return obj[key] + }) +} diff --git a/backend/test/helpers.ts b/backend/test/helpers.ts index 51610b07e..6e1856b63 100644 --- a/backend/test/helpers.ts +++ b/backend/test/helpers.ts @@ -25,8 +25,8 @@ export const cleanDB = async () => { } } -export const
testEnvironment = async () => { - const server = await createServer(context) +export const testEnvironment = async (logger?: any) => { + const server = await createServer(context, logger) const con = server.con const testClient = createTestClient(server.apollo) const mutate = testClient.mutate diff --git a/backend/test/testSetup.ts b/backend/test/testSetup.ts index d42836626..a43335e55 100644 --- a/backend/test/testSetup.ts +++ b/backend/test/testSetup.ts @@ -1,7 +1,22 @@ -/* eslint-disable no-console */ +import { backendLogger as logger } from '@/server/logger' -// disable console.info for apollo log - -// eslint-disable-next-line @typescript-eslint/no-empty-function -console.info = () => {} jest.setTimeout(1000000) + +jest.mock('@/server/logger', () => { + const originalModule = jest.requireActual('@/server/logger') + return { + __esModule: true, + ...originalModule, + backendLogger: { + addContext: jest.fn(), + trace: jest.fn(), + debug: jest.fn(), + warn: jest.fn(), + info: jest.fn(), + error: jest.fn(), + fatal: jest.fn(), + }, + } +}) + +export { logger } diff --git a/backend/yarn.lock b/backend/yarn.lock index f37b64d11..53a53cb9b 100644 --- a/backend/yarn.lock +++ b/backend/yarn.lock @@ -2,7 +2,7 @@ # yarn lockfile v1 -"@apollo/protobufjs@1.2.2", "@apollo/protobufjs@^1.0.3": +"@apollo/protobufjs@1.2.2": version "1.2.2" resolved "https://registry.yarnpkg.com/@apollo/protobufjs/-/protobufjs-1.2.2.tgz#4bd92cd7701ccaef6d517cdb75af2755f049f87c" integrity sha512-vF+zxhPiLtkwxONs6YanSt1EpwpGilThpneExUN5K3tCymuxNnVq2yojTvnpRjv2QfsEIt/n7ozPIIzBLwGIDQ== @@ -1265,24 +1265,6 @@ apollo-link@^1.2.14: tslib "^1.9.3" zen-observable-ts "^0.8.21" -apollo-log@^1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/apollo-log/-/apollo-log-1.1.0.tgz#e21287c917cf735b77adc06f07034f965e9b24de" - integrity sha512-TciLu+85LSqk7t7ZGKrYN5jFiCcRMLujBjrLiOQGHGgVVkvmKlwK0oELSS9kiHQIhTq23p8qVVWb08spLpQ7Jw== - dependencies: - apollo-server-plugin-base "^0.10.4" - chalk "^4.1.0" 
- fast-safe-stringify "^2.0.7" - loglevelnext "^4.0.1" - nanoid "^3.1.20" - -apollo-reporting-protobuf@^0.6.2: - version "0.6.2" - resolved "https://registry.yarnpkg.com/apollo-reporting-protobuf/-/apollo-reporting-protobuf-0.6.2.tgz#5572866be9b77f133916532b10e15fbaa4158304" - integrity sha512-WJTJxLM+MRHNUxt1RTl4zD0HrLdH44F2mDzMweBj1yHL0kSt8I1WwoiF/wiGVSpnG48LZrBegCaOJeuVbJTbtw== - dependencies: - "@apollo/protobufjs" "^1.0.3" - apollo-reporting-protobuf@^0.8.0: version "0.8.0" resolved "https://registry.yarnpkg.com/apollo-reporting-protobuf/-/apollo-reporting-protobuf-0.8.0.tgz#ae9d967934d3d8ed816fc85a0d8068ef45c371b9" @@ -1290,13 +1272,6 @@ apollo-reporting-protobuf@^0.8.0: dependencies: "@apollo/protobufjs" "1.2.2" -apollo-server-caching@^0.5.3: - version "0.5.3" - resolved "https://registry.yarnpkg.com/apollo-server-caching/-/apollo-server-caching-0.5.3.tgz#cf42a77ad09a46290a246810075eaa029b5305e1" - integrity sha512-iMi3087iphDAI0U2iSBE9qtx9kQoMMEWr6w+LwXruBD95ek9DWyj7OeC2U/ngLjRsXM43DoBDXlu7R+uMjahrQ== - dependencies: - lru-cache "^6.0.0" - apollo-server-caching@^0.7.0: version "0.7.0" resolved "https://registry.yarnpkg.com/apollo-server-caching/-/apollo-server-caching-0.7.0.tgz#e6d1e68e3bb571cba63a61f60b434fb771c6ff39" @@ -1335,7 +1310,7 @@ apollo-server-core@^2.25.2: subscriptions-transport-ws "^0.9.19" uuid "^8.0.0" -apollo-server-env@^3.0.0, apollo-server-env@^3.1.0: +apollo-server-env@^3.1.0: version "3.1.0" resolved "https://registry.yarnpkg.com/apollo-server-env/-/apollo-server-env-3.1.0.tgz#0733c2ef50aea596cc90cf40a53f6ea2ad402cd0" integrity sha512-iGdZgEOAuVop3vb0F2J3+kaBVi4caMoxefHosxmgzAbbSpvWehB8Y1QiSyyMeouYC38XNVk5wnZl+jdGSsWsIQ== @@ -1371,13 +1346,6 @@ apollo-server-express@^2.25.2: subscriptions-transport-ws "^0.9.19" type-is "^1.6.16" -apollo-server-plugin-base@^0.10.4: - version "0.10.4" - resolved "https://registry.yarnpkg.com/apollo-server-plugin-base/-/apollo-server-plugin-base-0.10.4.tgz#fbf73f64f95537ca9f9639dd7c535eb5eeb95dcd" - 
integrity sha512-HRhbyHgHFTLP0ImubQObYhSgpmVH4Rk1BinnceZmwudIVLKrqayIVOELdyext/QnSmmzg5W7vF3NLGBcVGMqDg== - dependencies: - apollo-server-types "^0.6.3" - apollo-server-plugin-base@^0.13.0: version "0.13.0" resolved "https://registry.yarnpkg.com/apollo-server-plugin-base/-/apollo-server-plugin-base-0.13.0.tgz#3f85751a420d3c4625355b6cb3fbdd2acbe71f13" @@ -1392,15 +1360,6 @@ apollo-server-testing@^2.25.2: dependencies: apollo-server-core "^2.25.2" -apollo-server-types@^0.6.3: - version "0.6.3" - resolved "https://registry.yarnpkg.com/apollo-server-types/-/apollo-server-types-0.6.3.tgz#f7aa25ff7157863264d01a77d7934aa6e13399e8" - integrity sha512-aVR7SlSGGY41E1f11YYz5bvwA89uGmkVUtzMiklDhZ7IgRJhysT5Dflt5IuwDxp+NdQkIhVCErUXakopocFLAg== - dependencies: - apollo-reporting-protobuf "^0.6.2" - apollo-server-caching "^0.5.3" - apollo-server-env "^3.0.0" - apollo-server-types@^0.9.0: version "0.9.0" resolved "https://registry.yarnpkg.com/apollo-server-types/-/apollo-server-types-0.9.0.tgz#ccf550b33b07c48c72f104fbe2876232b404848b" @@ -1952,6 +1911,11 @@ data-urls@^2.0.0: whatwg-mimetype "^2.3.0" whatwg-url "^8.0.0" +date-format@^4.0.9: + version "4.0.9" + resolved "https://registry.yarnpkg.com/date-format/-/date-format-4.0.9.tgz#4788015ac56dedebe83b03bc361f00c1ddcf1923" + integrity sha512-+8J+BOUpSrlKLQLeF8xJJVTxS8QfRSuJgwxSVvslzgO3E6khbI0F5mMEPf5mTYhCCm4h99knYP6H3W9n3BQFrg== + debug@2.6.9, debug@^2.2.0, debug@^2.6.9: version "2.6.9" resolved "https://registry.yarnpkg.com/debug/-/debug-2.6.9.tgz#5d128515df134ff327e90a4c93f4e077a536341f" @@ -1973,6 +1937,13 @@ debug@^3.2.6, debug@^3.2.7: dependencies: ms "^2.1.1" +debug@^4.3.4: + version "4.3.4" + resolved "https://registry.yarnpkg.com/debug/-/debug-4.3.4.tgz#1319f6579357f2338d3337d2cdd4914bb5dcc865" + integrity sha512-PRWFHuSU3eDtQJPvnNY7Jcket1j0t5OuOsFzPPzsekD52Zl8qUfFIPEiswXqIvHWGVHOgX+7G/vCNNhehwxfkQ== + dependencies: + ms "2.1.2" + decimal.js-light@^2.5.1: version "2.5.1" resolved 
"https://registry.yarnpkg.com/decimal.js-light/-/decimal.js-light-2.5.1.tgz#134fd32508f19e208f4fb2f8dac0d2626a867934" @@ -2558,11 +2529,6 @@ fast-levenshtein@^2.0.6, fast-levenshtein@~2.0.6: resolved "https://registry.yarnpkg.com/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz#3d8a5c66883a16a30ca8643e851f19baa7797917" integrity sha1-PYpcZog6FqMMqGQ+hR8Zuqd5eRc= -fast-safe-stringify@^2.0.7: - version "2.1.1" - resolved "https://registry.yarnpkg.com/fast-safe-stringify/-/fast-safe-stringify-2.1.1.tgz#c406a83b6e70d9e35ce3b30a81141df30aeba884" - integrity sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA== - fastq@^1.6.0: version "1.13.0" resolved "https://registry.yarnpkg.com/fastq/-/fastq-1.13.0.tgz#616760f88a7526bdfc596b7cab8c18938c36b98c" @@ -2632,6 +2598,11 @@ flatted@^3.1.0: resolved "https://registry.yarnpkg.com/flatted/-/flatted-3.2.2.tgz#64bfed5cb68fe3ca78b3eb214ad97b63bedce561" integrity sha512-JaTY/wtrcSyvXJl4IMFHPKyFur1sE9AUqc0QnhOaJ0CxHtAoIV8pYDzeEfAaNEtGkOfq4gr3LBFmdXW5mOQFnA== +flatted@^3.2.5: + version "3.2.5" + resolved "https://registry.yarnpkg.com/flatted/-/flatted-3.2.5.tgz#76c8584f4fc843db64702a6bd04ab7a8bd666da3" + integrity sha512-WIWGi2L3DyTUvUrwRKgGi9TwxQMUEqPOPQBVi71R96jZXJdFskXEmf54BoZaS1kknGODoIGASGEzBUYdyMCBJg== + follow-redirects@^1.14.0: version "1.14.4" resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.14.4.tgz#838fdf48a8bbdd79e52ee51fb1c94e3ed98b9379" @@ -2668,6 +2639,15 @@ fs-capacitor@^2.0.4: resolved "https://registry.yarnpkg.com/fs-capacitor/-/fs-capacitor-2.0.4.tgz#5a22e72d40ae5078b4fe64fe4d08c0d3fc88ad3c" integrity sha512-8S4f4WsCryNw2mJJchi46YgB6CR5Ze+4L1h8ewl9tEpL4SJ3ZO+c/bS4BWhB8bK+O3TMqhuZarTitd0S0eh2pA== +fs-extra@^10.1.0: + version "10.1.0" + resolved "https://registry.yarnpkg.com/fs-extra/-/fs-extra-10.1.0.tgz#02873cfbc4084dde127eaa5f9905eef2325d1abf" + integrity 
sha512-oRXApq54ETRj4eMiFzGnHWGy+zo5raudjuxN0b8H7s/RU2oW0Wvsx9O0ACRN/kRq9E8Vu/ReskGB5o3ji+FzHQ== + dependencies: + graceful-fs "^4.2.0" + jsonfile "^6.0.1" + universalify "^2.0.0" + fs.realpath@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/fs.realpath/-/fs.realpath-1.0.0.tgz#1504ad2523158caa40db4a2787cb01411994ea4f" @@ -2818,6 +2798,11 @@ graceful-fs@^4.1.2, graceful-fs@^4.2.4: resolved "https://registry.yarnpkg.com/graceful-fs/-/graceful-fs-4.2.8.tgz#e412b8d33f5e006593cbd3cee6df9f2cebbe802a" integrity sha512-qkIilPUYcNhJpd33n0GBXTB1MMPp14TxEsEs0pTrsSVucApsYzW5V+Q8Qxhik6KU3evy+qkAAowTByymK0avdg== +graceful-fs@^4.1.6, graceful-fs@^4.2.0: + version "4.2.10" + resolved "https://registry.yarnpkg.com/graceful-fs/-/graceful-fs-4.2.10.tgz#147d3a006da4ca3ce14728c7aefc287c367d7a6c" + integrity sha512-9ByhssR2fPVsNZj478qUUbKfmL0+t5BDVyjShtyZZLiK7ZDAArFFfopyOTj0M05wE2tJPisA4iTnnXl2YoPvOA== + graphql-extensions@^0.15.0: version "0.15.0" resolved "https://registry.yarnpkg.com/graphql-extensions/-/graphql-extensions-0.15.0.tgz#3f291f9274876b0c289fa4061909a12678bd9817" @@ -3810,6 +3795,15 @@ json5@^1.0.1: dependencies: minimist "^1.2.0" +jsonfile@^6.0.1: + version "6.1.0" + resolved "https://registry.yarnpkg.com/jsonfile/-/jsonfile-6.1.0.tgz#bc55b2634793c679ec6403094eb13698a6ec0aae" + integrity sha512-5dgndWOriYSm5cnYaJNhalLNDKOqFwyDB/rr1E9ZsGciGvKPs8R2xYGCacuf3z6K1YKDz182fd+fY3cn3pMqXQ== + dependencies: + universalify "^2.0.0" + optionalDependencies: + graceful-fs "^4.1.6" + jsonwebtoken@^8.5.1: version "8.5.1" resolved "https://registry.yarnpkg.com/jsonwebtoken/-/jsonwebtoken-8.5.1.tgz#00e71e0b8df54c2121a1f26137df2280673bcc0d" @@ -3978,16 +3972,22 @@ lodash@4.x, lodash@^4.7.0: resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.21.tgz#679591c564c3bffaae8454cf0b3df370c3d6911c" integrity sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg== +log4js@^6.4.6: + version "6.4.6" + resolved 
"https://registry.yarnpkg.com/log4js/-/log4js-6.4.6.tgz#1878aa3f09973298ecb441345fe9dd714e355c15" + integrity sha512-1XMtRBZszmVZqPAOOWczH+Q94AI42mtNWjvjA5RduKTSWjEc56uOBbyM1CJnfN4Ym0wSd8cQ43zOojlSHgRDAw== + dependencies: + date-format "^4.0.9" + debug "^4.3.4" + flatted "^3.2.5" + rfdc "^1.3.0" + streamroller "^3.0.8" + loglevel@^1.6.7: version "1.7.1" resolved "https://registry.yarnpkg.com/loglevel/-/loglevel-1.7.1.tgz#005fde2f5e6e47068f935ff28573e125ef72f197" integrity sha512-Hesni4s5UkWkwCGJMQGAh71PaLUmKFM60dHvq0zi/vDhhrzuk+4GgNbTXJ12YYQJn6ZKBDNIjYcuQGKudvqrIw== -loglevelnext@^4.0.1: - version "4.0.1" - resolved "https://registry.yarnpkg.com/loglevelnext/-/loglevelnext-4.0.1.tgz#4406c6348c243a35272ac75d7d8e4e60ecbcd011" - integrity sha512-/tlMUn5wqgzg9msy0PiWc+8fpVXEuYPq49c2RGyw2NAh0hSrgq6j/Z3YPnwWsILMoFJ+ZT6ePHnWUonkjDnq2Q== - long@^4.0.0: version "4.0.0" resolved "https://registry.yarnpkg.com/long/-/long-4.0.0.tgz#9a7b71cfb7d361a194ea555241c92f7468d5bf28" @@ -4150,11 +4150,6 @@ named-placeholders@^1.1.2: dependencies: lru-cache "^4.1.3" -nanoid@^3.1.20: - version "3.1.32" - resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.32.tgz#8f96069e6239cc0a9ae8c0d3b41a3b4933a88c0a" - integrity sha512-F8mf7R3iT9bvThBoW4tGXhXFHCctyCiUUPrWF8WaTqa3h96d9QybkSeba43XVOOE3oiLfkVDe4bT8MeGmkrTxw== - natural-compare@^1.4.0: version "1.4.0" resolved "https://registry.yarnpkg.com/natural-compare/-/natural-compare-1.4.0.tgz#4abebfeed7541f2c27acfb29bdbbd15c8d5ba4f7" @@ -4746,6 +4741,11 @@ reusify@^1.0.4: resolved "https://registry.yarnpkg.com/reusify/-/reusify-1.0.4.tgz#90da382b1e126efc02146e90845a88db12925d76" integrity sha512-U9nH88a3fc/ekCF1l0/UP1IosiuIjyTh7hBvXVMHYgVcfGvt897Xguj2UOLDeI5BG2m7/uwyaLVT6fbtCwTyzw== +rfdc@^1.3.0: + version "1.3.0" + resolved "https://registry.yarnpkg.com/rfdc/-/rfdc-1.3.0.tgz#d0b7c441ab2720d05dc4cf26e01c89631d9da08b" + integrity sha512-V2hovdzFbOi77/WajaSMXk2OLm+xNIeQdMMuB7icj7bk6zi2F8GGAxigcnDFpJHbNyNcgyJDiP+8nOrY5cZGrA== + rimraf@^3.0.0, 
rimraf@^3.0.2: version "3.0.2" resolved "https://registry.yarnpkg.com/rimraf/-/rimraf-3.0.2.tgz#f1a5402ba6220ad52cc1282bac1ae3aa49fd061a" @@ -4981,6 +4981,15 @@ stack-utils@^2.0.3: resolved "https://registry.yarnpkg.com/statuses/-/statuses-1.5.0.tgz#161c7dac177659fd9811f43771fa99381478628c" integrity sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow= +streamroller@^3.0.8: + version "3.0.8" + resolved "https://registry.yarnpkg.com/streamroller/-/streamroller-3.0.8.tgz#84b190e4080ee311ca1ebe0444e30ac8eedd028d" + integrity sha512-VI+ni3czbFZrd1MrlybxykWZ8sMDCMtTU7YJyhgb9M5X6d1DDxLdJr+gSnmRpXPMnIWxWKMaAE8K0WumBp3lDg== + dependencies: + date-format "^4.0.9" + debug "^4.3.4" + fs-extra "^10.1.0" + streamsearch@0.1.2: version "0.1.2" resolved "https://registry.yarnpkg.com/streamsearch/-/streamsearch-0.1.2.tgz#808b9d0e56fc273d809ba57338e929919a1a9f1a" @@ -5363,6 +5372,11 @@ universalify@^0.1.2: resolved "https://registry.yarnpkg.com/universalify/-/universalify-0.1.2.tgz#b646f69be3942dabcecc9d6639c80dc105efaa66" integrity sha512-rBJeI5CXAlmy1pV+617WB9J63U6XcazHHF2f2dbJix4XzpUF0RS3Zbj0FGIOCAva5P/d/GBOYaACQ1w+0azUkg== +universalify@^2.0.0: + version "2.0.0" + resolved "https://registry.yarnpkg.com/universalify/-/universalify-2.0.0.tgz#75a4984efedc4b08975c5aeb73f530d02df25717" + integrity sha512-hAZsKq7Yy11Zu1DE0OzWjw7nnLZmJZYTDZZyEFHZdUhV8FkH5MCfoU1XMaxXovpyW5nq5scPqq0ZDP9Zyl04oQ== + unpipe@1.0.0, unpipe@~1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/unpipe/-/unpipe-1.0.0.tgz#b2bf4ee8514aae6165b4817829d21b2ef49904ec" diff --git a/database/.prettierrc.js b/database/.prettierrc.js index 8495e3f20..bc1d767d7 100644 --- a/database/.prettierrc.js +++ b/database/.prettierrc.js @@ -5,4 +5,5 @@ module.exports = { trailingComma: "all", tabWidth: 2, bracketSpacing: true, + endOfLine: "auto", }; diff --git a/database/package.json b/database/package.json index f5a16fd31..7a960994c 100644 --- a/database/package.json +++ b/database/package.json @@ -5,7 +5,7 @@ "main": "src/index.ts", 
"repository": "https://github.com/gradido/gradido/database", "author": "Ulf Gebhardt", - "license": "MIT", + "license": "Apache-2.0", "private": false, "scripts": { "build": "mkdir -p build/src/config/ && cp src/config/*.txt build/src/config/ && tsc --build", diff --git a/docu/Concepts/BusinessRequirements/graphics/Creation_Flowchart.drawio b/docu/Concepts/BusinessRequirements/graphics/Creation_Flowchart.drawio new file mode 100644 index 000000000..4c401e10e --- /dev/null +++ b/docu/Concepts/BusinessRequirements/graphics/Creation_Flowchart.drawio @@ -0,0 +1 @@ +7VxbU9s6EP41eaTju5NHEug5D7TDlHZanhhhK45AtjKyQpL++rNy5PgiJ4Rgx4c2M8xgrVe2tN/u6vNKMLAn8eofjuazLyzEdGAZ4WpgXw0syzQNC35JyXojGXqjjSDiJFRKheCO/MZKaCjpgoQ4rSgKxqgg86owYEmCA1GRIc7Zsqo2ZbT61jmKsCa4CxDVpT9JKGZqFq5RyP/FJJqJ7YTVnRjlykqQzlDIliWRfT2wJ5wxsbmKVxNMpfFyu2z6fd5xdzswjhNxSIcoffphXUeTWCx/jtyft/OXr+GFQucF0YWa8MDyKDxvPIMXeJG8+kzZMpghLkARUA5JyOBqwjEShCVp3gHeXPRRUxbr3I4Cr7KnipiCwITLVHD2jCeMMg6ShCWgOZ4SSmuidI4CkkQgcIvWdzYHwQVM2R4vZ0TgO5DLVy3BDUHGXjCf0szcMxKGOAEZZ4skxNIaxnaEoAYj22lRc4sTODhmMRZ8DSqqg+UoaJVv26q5LBzF9pVsVnKSvBtSvhltn1zABxcKwTeg6Tqa3XEI7qyajIsZi1iC6HUhrZml0LlhmY0lVE9YiLWKTbQQrAokXhHxS3b/5KrWfenO1Uo9OWusK9aXg9tve5gLW/AA75m0rbIC4hEWe/RMuxlMjik48kt1IK1DY2uBNmGJ4ORxoYKohlsVlSYXLyHQhi8bw4ovm5buzFtZ2Zm9zpzZ0IxyUmf2D/dmsDFf/yo37ot4kM2iW9bqIAqcU0VB1vWSc7QuKcwZSURaevKtFBTeZTpuLVPW1qqavm0N9+nDxWYEhXdtp/IOh7M/Svb8QP62AbKvrOuZR0AKRG0u704pXl1KCtkuzNYxOH8yHLOM9QVIzNfgzlq3mBMwJub9+UCvLuBbfUR1D1b2jXea+ajMblcpsOm+ktir6q7KuZ3mdWcv9YI7NyR57p+BudUl8n9AwNx+18NjEmUlSZ50QXRPRcDeh6n34TjOB4DU7ZXjHENbO+Y4x3w71XD2rL7pzaHwe1avEa2tbpdhTJLelzPPqzGD3tezoWap7xwlKQqaKzDpksQUbaqSwBfyeJLWC2aEhjdozRZyzKlAwXPeGs8YJ79BHxV1TsSFihnbqGjcyZ7qmRynoHOb29esib6gVUXxBqUiHw2jFM1T8piNT3aMwWVJMmZCsLgSF20SFL+h3Ombrg6oae9BVL3tGw4ESiKYQUEVvdr7hg0OZDQ4UP11iEJ6SJDAYxkCqeZHLVDMkeZaonCtB7GGXFt3L1UQP7QKjiiJEmhSPJXdJJAkQPRSiWMShpvEvqmP32RqV04h+aYM5OyojiuiD2Nzx/ADNpzIxdaFsU6gbRZt+JHqXACNhuEjkrkPBodcYumUsJAIJNDjNlwO8r23fDnurrc3OuA+/3tXRsk3qUq4L1LMH6DTGe7W4fZ6h9vU4A5K
X5IPFL4jz9h3gv3oQOw721rzjtla64ZlF0XBLUO+rxDkzj+MTOtDUON8mAdvuv1ZlC8kHGcERIVsOyywvjWzDbkyLRs1peW8Y/s465ur52X4HanY3BG1u2lXM+CdrcN6RbeyDofA9c/Id4B8AwM7LfKWXu0I5DkkHD4g0THkQq7Px+NdB6lF/DvDu4F1NeJteV3h7Wl4xzhmZ6TbRto0vb6h9jWoUQxc+RzW7YPtWD2Dbep5PGYh5kiwM23rCnVv2PfqrVfge6if/I0BPzoQ+u6yu14hB+inhMdn6tYJ5PBd3jPktl4cLyB/XJ8hbx1yt2+67uh0vZLgT7AR9jfi7vfN3R2du3P8hANxjvSOKnB9E3jvmHNknW+HmP7gDRsiXZ8f8g7dJen3iLx/BJItojbo63TfwfBsstvpD1ZX91k857WT1VV9Vx3X6fRodW7DUuK/xzApQ77XMr7qlTp4KJmn0oe6P4JWP0HkmQ0EqekI2rCzXWT9C/hHivmfvhPZApZm/S8HjIZPWqcBy86OE/qN3zfn0yCt1LK8N5/8agS/s1KWr285k2m2JkQppDWOd+J+xF/A5y4QgM0kUdntBE05tbpWZxBtzgRYfkuxuT0Hl6PTdFLTbAceaBb/MmGzjBX/eMK+/g8= \ No newline at end of file diff --git a/docu/Concepts/BusinessRequirements/image/Creation_flowchart.drawio.png b/docu/Concepts/BusinessRequirements/image/Creation_flowchart.drawio.png new file mode 100644 index 000000000..1e5b21d75 Binary files /dev/null and b/docu/Concepts/BusinessRequirements/image/Creation_flowchart.drawio.png differ diff --git a/docu/Concepts/TechnicalRequirements/BusinessEventProtocol.md b/docu/Concepts/TechnicalRequirements/BusinessEventProtocol.md new file mode 100644 index 000000000..5a436d057 --- /dev/null +++ b/docu/Concepts/TechnicalRequirements/BusinessEventProtocol.md @@ -0,0 +1,81 @@ +# Business Event Protocol + +With the business event protocol the gradido application will capture and persist business information for future reports and statistics. The idea is to design and implement general functionality to capture and store business events. Each business event will be defined as a separate event type with its own business attributes. Each event type extends a basic event type to ensure a type safetiness with its mandatory and optional attributes. + +## EventType - Enum + +The different event types will be defined as Enum. The following list is a first draft and will grow with further event types in the future. 
+ +| EventType | Value | Description | +| --------------------------- | ----- | ---------------------------------------------------------------------------------------------------- | +| BasicEvent | 0 | the basic event is the root of all further extending event types | +| VisitGradidoEvent | 10 | if a user visits a gradido page without login or registration | +| RegisterEvent | 20 | the user presses the register button | +| RedeemRegisterEvent | 21 | the user presses the register button initiated by the redeem link | +| InActiveAccountEvent | 22 | the system creates an inactive account during the register process | +| SendConfirmEmailEvent | 23 | the system sends a confirmation email to the user during the register process | +| ConfirmEmailEvent | 24 | the user confirms his email during the register process | +| RegisterEmailKlickTippEvent | 25 | the system registers the confirmed email at klicktipp | +| LoginEvent | 30 | the user presses the login button | +| RedeemLoginEvent | 31 | the user presses the login button initiated by the redeem link | +| ActivateAccountEvent | 32 | the system activates the user's account during the first login process | +| PasswordChangeEvent | 33 | the user changes his password | +| TxSendEvent | 40 | the user creates a transaction and sends it online | +| TxSendRedeemEvent | 41 | the user creates a transaction and sends it per redeem link | +| TxRepeateRedeemEvent | 42 | the user recreates a redeem link for a still open transaction | +| TxCreationEvent | 50 | the user receives a creation transaction for his confirmed contribution | +| TxReceiveEvent | 51 | the user receives a transaction from another user and posts the amount on his account | +| TxReceiveRedeemEvent | 52 | the user activates the redeem link, receives the transaction and posts the amount on his account | +| ContribCreateEvent | 60 | the user enters his contribution and asks for confirmation | +| ContribConfirmEvent | 61 | the user confirms a contribution of another user
(for future multi confirmation from several users) | +| | | + +## EventProtocol - Entity + +The business events will be stored in the database in the new table `EventProtocol`. The table will have the following attributes: + +| Attribute | Type | Description | +| ------------- | --------- | ------------------------------------------------------------------------------------------------ | +| id | int | technical unique key (from db sequence) | +| type | enum | type of event | +| createdAt | timestamp | timestamp when the event occurred (not the time of writing) | +| userID | string | the ID of the user who invokes the event | +| XuserID | string | the cross user ID of the user involved in the process, like a tx-sender, contrib-receiver, ... | +| XcommunityID | string | the cross community ID of the community involved in the process, like a tx-sender, contrib-receiver, ... | +| transactionID | int | the technical key of the transaction, which triggers the event | +| contribID | int | the technical key of the contribution, which triggers the event | +| amount | decimal | the amount of gradido transferred by transaction, creation or redeem | + +## Event Types + +The following table lists, for each event type, the mandatory attributes that have to be initialized when the event occurs and written to the database event protocol table: + +| EventType | id | type | createdAt | userID | XuserID | XCommunityID | transactionID | contribID | amount | +| :-------------------------- | :-: | :--: | :-------: | :----: | :-----: | :----------: | :-----------: | :-------: | :----: | +| BasicEvent | x | x | x | | | | | | | +| VisitGradidoEvent | x | x | x | | | | | | | +| RegisterEvent | x | x | x | x | | | | | | +| RedeemRegisterEvent | x | x | x | x | | | | | | +| InActiveAccountEvent | x | x | x | x | | | | | | +| SendConfirmEmailEvent | x | x | x | x | | | | | | +| ConfirmEmailEvent | x | x | x | x | | | | | | +| RegisterEmailKlickTippEvent | x | x | x | x | | | | | | +| LoginEvent | x | x | x | x | | | |
| | +| RedeemLoginEvent | x | x | x | x | | | | | | +| ActivateAccountEvent | x | x | x | x | | | | | | +| PasswordChangeEvent | x | x | x | x | | | | | | +| TxSendEvent | x | x | x | x | x | x | x | | x | +| TxSendRedeemEvent | x | x | x | x | x | x | x | | x | +| TxRepeateRedeemEvent | x | x | x | x | x | x | x | | x | +| TxCreationEvent | x | x | x | x | | | x | | x | +| TxReceiveEvent | x | x | x | x | x | x | x | | x | +| TxReceiveRedeemEvent | x | x | x | x | x | x | x | | x | +| ContribCreateEvent | x | x | x | x | | | | x | | +| ContribConfirmEvent | x | x | x | x | x | x | | x | | +| | | | | | | | | | | + +## Event creation + +The business logic needs a *general event creation* service/method, which accepts as input one of the predefined event type objects. An event object has to be initialized with its mandatory attributes before it can be passed to the event creation service. The service maps the event object attributes to the database entity and writes a new entry in the `EventProtocol` table. + +At each relevant location in the gradido business logic, an event creation call matching the corresponding event type has to be added manually - see [EventType-Enum](#EventType-Enum) above. diff --git a/docu/Concepts/TechnicalRequirements/Federation.md b/docu/Concepts/TechnicalRequirements/Federation.md index 2f4ffc0f9..959aa8afe 100644 --- a/docu/Concepts/TechnicalRequirements/Federation.md +++ b/docu/Concepts/TechnicalRequirements/Federation.md @@ -1,8 +1,10 @@ # Federation -This document contains the concept and technical details for the *federation* of gradido communities. It base on the [ActivityPub specification](https://www.w3.org/TR/activitypub/ " ") and is extended for the gradido requirements. +This document contains the concept and technical details for the *federation* of gradido communities.
The first idea was to base the federation on the [ActivityPub specification](https://www.w3.org/TR/activitypub/ " ") and to extend it for the gradido requirements.
-## ActivityPub
+Meanwhile, however, the usage of a DHT like HyperSwarm promises to cover more of the gradido requirements out of the box. More details about HyperSwarm can be found at [@hyperswarm/dht](https://github.com/hyperswarm/dht).
+
+## ActivityPub (deprecated)
 The activity pub defines a server-to-server federation protocol to share information between decentralized instances and will be the main komponent for the gradido community federation.
@@ -25,8 +27,387 @@ The Variant A with an internal server contains the benefit to be as independent
 The Varaint B with an external server contains the benefit to reduce the implementation efforts and the responsibility for an own ActivitPub-Server. But it will cause an additional dependency to a third party service provider and the growing hosting costs.
+## HyperSwarm
+The decision to switch from ActivityPub to HyperSwarm is based on the argument that the *hyperswarm/dht* library will satisfy most of the federation requirements out of the box. The task is now to turn the business requirements of the [gradido community communication](../BusinessRequirements/CommunityVerwaltung.md#UC-createCommunity) into a technical concept.
-## ActivityStream
+The challenge for the decentralized communities of gradido will be: *how does a new community become acquainted with an existing community*?
-An ActivityStream includes all definitions and terms needed for community activities and content flow around the gradido community network.
+To enable such a relationship between an existing community and a new community, several stages have to be passed through:
+
+1. Federation
+   * join&connect
+   * direct exchange
+2. Authentication
+3. Authorized Communication
+
+### Overview
+
+At first the following diagram gives an overview of the three stages and shows the handshake between an existing community-A and a newly created community-B, including the data exchange for building up such a federated, authenticated and authorized relationship.
+
+![FederationHyperSwarm.png](./image/FederationHyperSwarm.png)
+
+### Prerequisites
+
+Before describing the details of the federation handshake, some prerequisites have to be defined.
+
+#### Database
+
+With the federation, additional data tables/entities have to be created.
+
+##### Community-Entity
+
+Create the new *Community* table to store the attributes of the own community. This table serves as a frame for future own community data, like the list of federated foreign communities, own users, further own accounts like the AUF- and Welfare-account, and the profile data of the own community:
+
+| Attributes | Type | Description |
+| ---------- | ---- | ----------- |
+| id | int | technical unique key of this entity |
+| uuid | string | unique key for a community, which will never change as long as the community exists |
+| name | string | name of the community shown on the UI, e.g. for selection of a community |
+| description | string | description of the community shown on the UI to present more community details |
+| ... | | in the near future additional attributes like profile-info, trading-level, set-of-rights, ... will follow with the next level of Multi-Community Readiness |
+
+##### CommunityFederation-Entity
+
+Create the new *CommunityFederation* table to store, at this point in time, only the attributes used by the federation handshake:
+
+| Attributes | Type | Description |
+| ---------- | ---- | ----------- |
+| id | int | technical unique key of this entity |
+| uuid | string | unique key for a community, which will never change as long as the community exists |
+| foreign | boolean | flag to mark the entry as a foreign or own community entry |
+| createdAt | timestamp | the timestamp the community entry was created |
+| privateKey | string | the private key of the community for asymmetric encryption (only set for the own community) |
+| pubKey | string | the public key of the community for asymmetric encryption |
+| pubKeyVerifiedAt | timestamp | the timestamp the pubKey of this foreign community was verified (for the own community it is always null) |
+| authenticatedAt | timestamp | the timestamp of the last successful authentication with this foreign community |
+| | | in the near future additional attributes will follow with the next level of Multi-Community Readiness |
+
+##### CommunityApiVersion-Entity
+
+Create the new *CommunityApiVersion* table to support several urls and apiversions of one community at once.
It references the table *CommunityFederation* with the foreign key *communityFederationID* (naming pattern: foreign key = `ID`) for a 1:n relationship.
+
+| Attributes | Type | Description |
+| ---------- | ---- | ----------- |
+| id | int | technical unique key of this entity |
+| communityFederationID | int | the technical foreign key to the *CommunityFederation* entity |
+| url | string | the URL under which the community will provide its services, could change during the lifetime of the community |
+| apiversion | string | the API version under which the community will provide its services, will increase with each release |
+| validFrom | timestamp | the timestamp as of which the url and api are provided by the community |
+| verifiedAt | timestamp | the timestamp the url and apiversion of this foreign community were verified (for the own community it is always null) |
+
+#### Configuration
+
+The preparation of a community infrastructure needs some configuration outside the application, which will be read during the start phase of the gradido components. The configuration will be defined in a file-based manner, either as key-value pairs (properties) or in a structured format like XML or JSON files.
+
+| Key | Value | Default-Value | Description |
+| --- | ----- | ------------- | ----------- |
+| stage.name | dev<br>stage1<br>stage2<br>prod | dev | defines the name of the stage this instance will serve |
+| stage.host | | | the name of the host or ip this instance will run on |
+| stage.mode | test<br>prod | test | the running mode this instance will work in |
+| federation.communityname | | Gradido-Akademie | the name of this community |
+| federation.apiversion | `` | currently 1.7 | defines the current api version under which this instance will provide its services |
+| federation.apiversion.`.`url | | gdd.gradido.net | defines the url on which this instance of a community will provide its services |
+| federation.apiversion.`.`validFrom | | | defines the timestamp as of which the apiversion is or will be valid |
+| federation.dhtnode.topic | | dht_gradido_topic | defines the name of the federation topic, which is used to join and connect as federation-channel |
+| federation.dhtnode.host | | | defines the host where the DHT-Node is hosted, if outside apollo |
+| federation.dhtnode.port | | | defines the port on which the DHT-Node will provide its services, if outside apollo |
+
+#### 1st Start of a community
+
+The first time a new community infrastructure is started on a server, the start phase has to check and prepare the own community database for federation. That means the application has to read the configuration and check against the database whether all currently configured data is propagated in the database, especially in the *CommunityXXX* entities.
+
+* check if the *Community* table is empty or if an existing community entry does not equal the configured values, then update as follows:
+
+  * community.id = next sequence value
+  * community.uuid = generated UUID (version 4)
+  * community.name = Configuration.federation.communityname
+  * community.description = null
+* prepare the *CommunityFederation* table
+
+  * communityFederation.id = next sequence value
+  * communityFederation.uuid = community.uuid
+  * communityFederation.foreign = FALSE
+  * communityFederation.createdAt = NOW
+  * communityFederation.privateKey = null
+  * communityFederation.pubKey = null
+  * communityFederation.pubKeyVerifiedAt = null
+  * communityFederation.authenticatedAt = null
+* prepare the *CommunityApiVersion* table with all configured apiversions:
+
+  * communityApiVersion.id = next sequence value
+  * communityApiVersion.communityFederationID = communityFederation.id
+  * communityApiVersion.url = Configuration.federation.apiversion.``.url
+  * communityApiVersion.apiversion = Configuration.federation.apiversion
+  * communityApiVersion.validFrom = Configuration.federation.apiversion.``.validFrom
+  * communityApiVersion.verifiedAt = null
+
+### Stage1 - Federation
+
+For the 1st stage the *hyperswarm dht library* will be used. It supports an easy way to connect a new community with other existing communities. As shown in the picture above, the *hyperswarm dht library* will be part of the component *DHT-Node*, separated from the *apollo server* component. The background for this separation is to keep the federation activity away from the business processes and to enable component-specific scaling in the future. Consequently, for the inter-component communication between *DHT-Node*, *apollo server* and other components like the *database*, the interfaces and security have to be defined during development using technical standards.
+
+For the first federation release the *DHT-Node* will be part of the *apollo server*, but internally designed and implemented as a logically separated component.
+
+#### Sequence join&connect
+
+1. In the own database of community_A the entities *Community*, *CommunityFederation* and *CommunityApiVersion* are initialized
+2. When starting, the *DHT-Node* of community_A searches per *apollo-ORM* for the own community entry and checks whether the keypair *CommunityFederation.pubKey* and *CommunityFederation.privateKey* exists in the database. If it does not exist, the *DHT-Node* generates the keypair *pubKey* and *privateKey* and writes it per *apollo-ORM* to the database
+3. To join the correct channel of the *hyperswarm dht*, a topic has to be used. The *DHT-Node* reads the configured value of the property *federation.dhtnode.topic*.
+4. With the *CommunityFederation.pubKey* and the *federation.dhtnode.topic* the *DHT-Node* joins the *hyperswarm dht* and listens for other *DHT-Nodes* on the topic.
+5. 
As soon as the *hyperswarm dht* notifies about an additional node in the topic, the *DHT-Node* reads the *pubKey* of this additional node and searches for it per *apollo-ORM* in the *CommunityFederation* table, filtering with *CommunityFederation.foreign* = TRUE
+6. If an entry with the *CommunityFederation.pubKey* of the foreign node already exists and the *CommunityFederation.pubKeyVerifiedAt* is not NULL, both the *DHT-Node* and the foreign node have passed through the federation process before. Nevertheless the following steps and stages have to be processed to update e.g. the api versions or other data changed in the meantime.
+7. If an entry with the *CommunityFederation.pubKey* of the additional node can't be found, the *DHT-Node* starts with the next step *direct exchange* of the federation handshake anyway.
+
+#### Sequence direct exchange
+
+1. If the *CommunityFederation.pubKey* of the additional node does not exist in the *CommunityFederation* table, the *DHT-Node* starts a *direct exchange* with this foreign node to get the data for the first time, otherwise to update previously exchanged data.
+2. The *DHT-Node* opens a direct connection per *hyperswarm* to the additional node and both exchange their *url* and *apiversion* with each other.
+   1. To support the future feature that one community can provide several urls and apiversions at once, the exchanged data should be in a format which can represent structured information, like JSON (or simply CSV - the implementation is open, but be aware of security aspects to avoid possible attacks through the exchanged data and the parsers used for it)
+
+      ```
+      {
+        "API": [
+          { "url": "comB.com", "version": "1.0", "validFrom": "2022.01.01" },
+          { "url": "comB.com", "version": "1.1", "validFrom": "2022.04.15" },
+          { "url": "comB.de", "version": "2.0", "validFrom": "2022.06.01" }
+        ]
+      }
+      ```
+   2. 
the *DHT-Node* writes the received and parsed data from the foreign node per *apollo-ORM* to the database
+
+      1. In the future an optimization step will be introduced here to avoid possible attacks of a foreign node polluting our database with masses of data.
+
+         1. Before the *apollo-ORM* writes the data to the database, the *apollo-graphQL* invokes, for all received urls and apiversions, the request https://`//getPubKey()` at the foreign node.
+         2. Normally the foreign node will respond with its publicKey in a very short time, because there is nothing to en- or decrypt and no other complex processing steps.
+         3. If such a request runs into a timeout anyhow, the previously exchanged data of the foreign node is almost certainly fake and can be refused without storing it in the database. Break the further federation processing steps and stages and return to stage1 join&connect.
+         4. If the response is in time, the received publicKey must equal the pubKey of the foreign node which the *DHT-Node* got from the *hyperswarm dht* topic during the join&connect stage before
+         5. If both keys are the same, the writing of the exchanged data per *apollo-ORM* can go on.
+         6. If both keys do not match, the data exchanged during the direct connection is almost certainly fake and can be refused without storing it in the database. Break the further federation processing steps and stages and return to stage1 join&connect.
+      2. 
the *apollo-ORM* inserts, updates or deletes the received data as follows
+
+         * insert/update in the *CommunityFederation* table for this foreign node:
+
+           | Column | insert | update |
+           | ------ | ------ | ------ |
+           | communityFederation.id | next sequence value | keep existing value |
+           | communityFederation.uuid | null | keep existing value |
+           | communityFederation.foreign | TRUE | keep existing value |
+           | communityFederation.createdAt | NOW | keep existing value |
+           | communityFederation.privateKey | null | keep existing value |
+           | communityFederation.pubKey | exchangedData.pubKey | keep existing value |
+           | communityFederation.pubKeyVerifiedAt | null | keep existing value |
+           | communityFederation.authenticatedAt | null | keep existing value |
+         * for each exchangedData API:
+
+           if the API does not exist in the database, then insert into the *CommunityApiVersion* table:
+
+           | Column | insert |
+           | ------ | ------ |
+           | communityApiVersion.id | next sequence value |
+           | communityApiVersion.communityFederationID | communityFederation.id |
+           | communityApiVersion.url | exchangedData.API.url |
+           | communityApiVersion.apiversion | exchangedData.API.version |
+           | communityApiVersion.validFrom | exchangedData.API.validFrom |
+           | communityApiVersion.verifiedAt | null |
+
+           if the API exists in the database but was not part of the last data exchange, then delete it from the *CommunityApiVersion* table
+
+           if the API exists in the database and was part of the last data exchange, then update it in the *CommunityApiVersion* table:
+
+           | Column | update |
+           | ------ | ------ |
+           | communityApiVersion.id | keep existing value |
+           | communityApiVersion.communityFederationID | keep existing value |
+           | communityApiVersion.url | keep existing value |
+           | communityApiVersion.apiversion | keep existing value |
+           | communityApiVersion.validFrom | exchangedData.API.validFrom |
+           | communityApiVersion.verifiedAt | keep existing value |
+      3. After all received data is stored successfully, the *DHT-Node* starts *stage2 - Authentication* of the federation handshake
+
+### Stage2 - Authentication
+
+The 2nd stage of federation is called *authentication*, because during the 1st stage the *hyperswarm dht* only ensures that a node is the owner of its keypair *pubKey* and *privateKey*. The data exchanged between two nodes during the *direct exchange* on the *hyperswarm dht channel* must be verified, i.e. it must be ensured that the proclaimed *url(s)* and *apiversion(s)* of a node are the correct addresses to reach the same node outside the hyperswarm infrastructure.
+
+As mentioned before, the *DHT-Node* invokes the *authentication* stage on the *apollo server* *graphQL* with the previously stored data of the foreign node.
+
+#### Sequence - view of existing Community
+
+1. The authentication stage starts by reading all necessary data of the *foreignNode* from the previous federation step
+   1. select with the *foreignNode.pubKey* from the tables *CommunityFederation* and *CommunityApiVersion* where *CommunityApiVersion.validFrom* <= NOW and *CommunityApiVersion.verifiedAt* = null
+   2. the resultSet will be a list of data with the following attributes
+      * foreignNode.pubKey
+      * foreignNode.url
+      * foreignNode.apiVersion
+2. read the own keypair and uuid by `select uuid, privateKey, pubKey from CommunityFederation cf where cf.foreign = FALSE`
+3. for each entry of the resultSet from step 1 do
+   1. encryptedURL = encrypt the *foreignNode.url* and *foreignNode.apiVersion* with the *foreignNode.pubKey*
+   2. signedAndEncryptedURL = sign the result of the encryption with the own *privateKey*
+   3. invoke the request `https:////openConnection(own.pubKey, signedAndEncryptedURL)`
+   4. 
the foreign node will respond immediately with an empty response OK, otherwise break the authentication stage with an error
+4. the foreign node will process the request on its side - see [description below](#Sequence - view of new Community) - and invokes a redirect request based on the data exchanged during stage1 - Federation. This could be more than one redirect request, depending on the number of supported urls and apiversions we propagated to the foreignNode before.
+   1. If the other community does not react with an `openConnectionRedirect`-request, there will be an error like mismatching data and the further federation processing will end and go back to join&connect.
+5. for each received request `https:////openConnectionRedirect(onetimecode, foreignNode.url, encryptedRedirectURL)` do
+   1. with the given parameters the following steps will be done
+      1. search for the *foreignNode.pubKey* by `select cf.pubKey from CommunityApiVersion cav, CommunityFederation cf where cav.url = foreignNode.url and cav.communityFederationID = cf.id`
+      2. decrypt with the `own.privateKey` the received `encryptedRedirectURL` parameter, which contains a fully qualified url incl. apiversion and route
+      3. verify the signature of `encryptedRedirectURL` with the *foreignNode.pubKey* previously found in the own database
+      4. if the decryption and signature verification are successful, then encrypt the *own.uuid* with the *own.privateKey* to *encryptedOwnUUID*
+      5. invoke the redirect request with https://`(onetimecode, encryptedOwnUUID)` and
+      6. wait for the response with the `encryptedForeignUUID`
+      7. decrypt the `encryptedForeignUUID` with the *foreignNode.pubKey*
+      8. 
write the decrypted *foreignNode.UUID* to the database by updating the CommunityFederation table per `update CommunityFederation cf set cf.uuid = foreignNode.UUID, cf.pubKeyVerifiedAt = NOW where cf.pubKey = foreignNode.pubKey`
+
+After all redirect requests are processed, all relevant authentication data of the new community is well known here and stored in the database.
+
+#### Sequence - view of new Community
+
+This chapter describes the authentication stage on the new community's side, triggered by the request `openConnection(pubKey, signedAndEncryptedURL)`.
+
+As soon as the *openConnection* request is invoked:
+
+1. decrypt the 2nd parameter `signedAndEncryptedURL` with the own *privateKey*
+2. with the 1st parameter *pubKey* search in the own database by `select uuid, url, pubKey from CommunityFederation cf where cf.foreign = TRUE and cf.pubKey = parameter.pubKey`
+3. check if the decrypted `parameter.signedAndEncryptedURL` equals the url of the previously selected CommunityFederation entry
+   1. if not, then break the further processing of this request and only write an error-log event. There will be no answer to the invoking community, because this community will only go on with an `openConnectionRedirect`-request from this community.
+   2. if yes, then verify the signature of `parameter.signedAndEncryptedURL` with the `cf.pubKey` read in step 2 before
+   3. 
+4. 
+
+### Stage3 - Authorized Business Communication
+
+ongoing
+
+# Review by Ulf
+
+## Communication concept
+
+The communication happens in 4 stages.
+
+- Stage1: Federation
+- Stage2: Direct-Connection
+- Stage3: GraphQL-Verification
+- Stage4: GraphQL-Content
+
+### Stage1 - Federation
+
+Using the hyperswarm dht library we can find each other easily and exchange a pubKey, knowing that the other side owns the corresponding private key.
+
+```
+ComA ---- announce ----> DHT
+ComB <--- listen ------- DHT
+```
+
+Each peer will know the `pubKey` of the other participants.
Furthermore a direct connection is possible.
+
+```
+ComB ---- connect -----> ComA
+ComB ---- data --------> ComA
+```
+
+### Stage2 - Direct-Connection
+
+The hyperswarm dht library offers a secure channel based on the exchanged `pubKey`, so we do not need to verify things.
+
+The idea is now to exchange the GraphQL endpoints and their corresponding API versions in form of JSON
+
+```
+{
+  "API": {
+    "1.0": "https://comB.com/api/1.0/",
+    "1.1": "https://comB.com/api/1.1/",
+    "2.4": "https://comB.de/api/2.4/"
+  }
+}
+```
+
+### Stage3 - GraphQL-Verification
+
+The task of Stage3 is to verify that the data collected through the two federation stages is correct, since we did not yet verify that the proclaimed URL actually belongs to the peer we talked to in the federation. Furthermore the sender must be verified to ensure the queried community does not reveal things to an unauthorized third party.
+
+```
+ComA ----- verify -----> ComB
+ComA <---- authorize --- ComB
+```
+
+Assuming this dataset on ComA after a federation (leaving out multiple API endpoints to simplify things):
+
+```
+| PubKey | API-Endpoint  | PubKey Verified On |
+|--------|---------------|--------------------|
+| PubA*  | ComA.com/api/ | NULL               |
+| PubB   | ComB.com/api/ | NULL               |
+| PubC   | ComB.com/api/ | NULL               |
+
+* = self
+```
+
+using the GraphQL endpoint to query things:
+
+```
+ComA ---- getPubKey ---> ComB.com
+ComA <--- PubB --------- ComB.com
+
+ComA UPDATE database SET pubKeyVerifiedOn = now WHERE API-Endpoint=queryURL AND PubKey=QueryResult
+```
+
+resulting in:
+
+```
+| PubKey | API-Endpoint  | PubKey Verified On |
+|--------|---------------|--------------------|
+| PubA*  | ComA.com/api/ | 1.1.1970           |
+| PubB   | ComB.com/api/ | NOW                |
+| PubC   | ComB.com/api/ | NULL               |
+```
+
+Furthermore we use the header to transport a proof of who the caller is when calling and when answering:
+
+```
+ComA ---- getPubKey, sign({pubA, crypt(timeToken,pubB)},privA) --> ComB.com
+ComB: is pubA known to me?
+ComB: is the signature correct?
+ComB: can I decrypt payload?
+ComB: is timeToken <= 10sec?
+ComA <----- PubB, sign({timeToken}, privB) ----------------------- ComB.com
+ComA: is timeToken correct?
+ComA: is signature correct?
+```
+
+We call this process authentication; it can result in several errors:
+
+1. Your pubKey was not known to me
+2. Your signature is incorrect
+3. I cannot decrypt your payload
+4. Token timeout (recoverable)
+5. Result token was incorrect
+6. Result signature was incorrect
+
+```
+| PubKey | API-Endpoint  | PubKey Verified On | AuthenticationLastSuccess |
+|--------|---------------|--------------------|---------------------------|
+| PubA*  | ComA.com/api/ | 1.1.1970           | 1.1.1970                  |
+| PubB   | ComB.com/api/ | NOW                | NOW                       |
+| PubC   | ComB.com/api/ | NULL               | NULL                      |
+```
+
+The next process is the authorization. This happens on every call on the receiver side to determine which call is allowed for the other side.
+
+```
+ComA ---- getPubKey, sign({pubA, crypt(timeToken,pubB)},privA) --> ComB.com
+ComB: did I verify pubA? SELECT PubKeyVerifiedOn FROM database WHERE PubKey = pubA
+ComB: is pubA allowed to query this?
+```
diff --git a/docu/Concepts/TechnicalRequirements/UC_Introduction_of_Gradido-ID.md b/docu/Concepts/TechnicalRequirements/UC_Introduction_of_Gradido-ID.md
new file mode 100644
index 000000000..e3c0ac2d7
--- /dev/null
+++ b/docu/Concepts/TechnicalRequirements/UC_Introduction_of_Gradido-ID.md
@@ -0,0 +1,140 @@
+# Introduction of Gradido-ID
+
+## Motivation
+
+The introduction of the Gradido-ID is based on the requirement to identify a user account by a technical key instead of an email address. Such a technical key ensures an exact identification of a user account without exposing detailed information for possible misuse.
+
+Additionally, the Gradido-ID allows administrating any user account data, like changing the email address or defining several email addresses, without any side effects on the identification of the user account.
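The Gradido-ID is a UUID version 4 (see the definition below). As a minimal sketch of what such a technical key looks like - the helper names are illustrative, not the actual gradido code - it can be generated and validated like this:

```typescript
import { randomUUID } from "node:crypto";

// Generate a new Gradido-ID as a UUID version 4 (illustrative helper name).
function generateGradidoId(): string {
  return randomUUID();
}

// A well-formed UUID v4: 8-4-4-4-12 hex digits, version nibble "4",
// variant nibble 8/9/a/b.
const UUID_V4 =
  /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;

function isGradidoId(id: string): boolean {
  return UUID_V4.test(id);
}
```

Unlike an email address, such a key carries no personal information, which is exactly the motivation stated above.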
+
+## Definition
+
+The formalized definition of the Gradido-ID can be found in the document [BenutzerVerwaltung#Gradido-ID](../BusinessRequirements/BenutzerVerwaltung#Gradido-ID).
+
+## Steps of Introduction
+
+To introduce the Gradido-ID several steps are necessary. The first step is to define a proper database schema with additional columns and tables, followed by data migration steps to add or initialize the new columns and tables while keeping all data valid.
+
+The second step is to describe all concerned business logic processes, which have to be adapted when introducing the Gradido-ID.
+
+### Database-Schema
+
+#### Users-Table
+
+The entity `users` has to be changed by adding the following columns.
+
+| Column | Type | Description |
+| ------ | ---- | ----------- |
+| gradidoID | String | technical unique key of the user as UUID (version 4) |
+| alias | String | a business unique key of the user |
+| passphraseEncryptionType | int | defines the type of encrypting the passphrase: 1 = email (default), 2 = gradidoID, ... |
+| emailID | int | technical foreign key to the new entity UserContacts |
+
+##### Email vs emailID
+
+The existing column `email` will now become the primary email contact, which will be stored as a contact entry in the new `UserContacts` table. It is necessary to decide whether the content of `email` will be replaced by the foreign key `emailID` referencing the contact entry with the email address, or whether the email itself will be kept as a denormalized and duplicate value in the `users` table.
+
+The preferred and proper solution is to add a new column `Users.emailID` as foreign key to the `UserContacts` entry and to delete the `Users.email` column after the migration of the email address into the `UserContacts` table.
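The preferred foreign-key variant can be sketched with illustrative types (the names follow the text above, everything else is an assumption, not the actual gradido entities):

```typescript
// Sketch of the preferred variant: Users holds only a foreign key
// (emailId) to the primary email contact stored in UserContacts.
interface UserContact {
  id: number;
  userId: number;
  email: string;
}

interface User {
  id: number;
  gradidoId: string;
  emailId: number; // foreign key to UserContact.id, replaces Users.email
}

// Resolving the primary email now requires a lookup instead of reading
// a denormalized copy from the users table.
function primaryEmail(user: User, contacts: UserContact[]): string | undefined {
  return contacts.find((c) => c.id === user.emailId)?.email;
}
```

The extra lookup is the price paid for avoiding a duplicated, possibly stale email value in `users`.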
+
+#### new UserContacts-Table
+
+A new entity `UserContacts` is introduced to store several contacts of different types like email, telephone or other kinds of contact addresses.
+
+| Column | Type | Description |
+| ------ | ---- | ----------- |
+| id | int | the technical key of a contact entity |
+| type | int | defines the type of contact entry as enum: Email, Phone, etc |
+| usersID | int | defines the foreign key to the `Users` table |
+| email | String | defines the address of a contact entry of type Email |
+| phone | String | defines the address of a contact entry of type Phone |
+| contactChannels | String | defines the contact channels, as comma-separated list, for which this entry is confirmed by the user, e.g. main contact (default), infomail, contracting, advertising, ... |
+
+### Database-Migration
+
+After the adaptation of the database schema, several data migration steps are necessary to initialize the new and changed columns and tables while keeping the data valid and consistent.
+
+#### Initialize GradidoID
+
+In a one-time migration, create a unique UUID (version 4) for each entry of the `Users` table.
+
+#### Primary Email Contact
+
+In a one-time migration, read the `Users.id` and `Users.email` of each entry of the `Users` table and create a new entry in the `UserContacts` table for it, initializing the contact values with:
+
+* id = new technical key
+* type = Enum-Email
+* usersID = `Users.id`
+* email = `Users.email`
+* phone = null
+* contactChannels = Enum-"main contact"
+
+and update the `Users` entry with `Users.emailID = UserContacts.id` and `Users.passphraseEncryptionType = 1`
+
+After this one-time migration the column `Users.email` can be deleted.
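The two one-time migrations above can be sketched as a pure function over the existing user rows. This is a simplified in-memory model - column names follow the text, the sequence handling and everything else are assumptions, not the actual migration code:

```typescript
import { randomUUID } from "node:crypto";

interface OldUser { id: number; email: string; }
interface NewUser { id: number; gradidoID: string; emailID: number; passphraseEncryptionType: number; }
interface UserContactRow { id: number; type: "Email"; usersID: number; email: string; contactChannels: string; }

// One-time migration sketch: assign a UUID v4 per user, move the email
// into a UserContacts row and link it back via emailID.
function migrateUsers(users: OldUser[]): { users: NewUser[]; contacts: UserContactRow[] } {
  const contacts: UserContactRow[] = [];
  const migrated = users.map((u) => {
    const contact: UserContactRow = {
      id: contacts.length + 1, // stands in for the db sequence
      type: "Email",
      usersID: u.id,
      email: u.email,
      contactChannels: "main contact",
    };
    contacts.push(contact);
    return {
      id: u.id,
      gradidoID: randomUUID(),
      emailID: contact.id,
      // existing passphrases are still encrypted with the email
      passphraseEncryptionType: 1,
    };
  });
  return { users: migrated, contacts };
}
```

Keeping `passphraseEncryptionType = 1` during the migration is what allows existing users to keep logging in until their passphrase is re-encrypted.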
+
+### Adaption of BusinessLogic
+
+The following logic and business processes have to be adapted when introducing the Gradido-ID.
+
+#### Read-Write Access of Users-Table especially Email
+
+The ORM mapping has to be adapted to the changed and new database schema.
+
+#### Registration Process
+
+The logic of the registration process has to be adapted by
+
+* initializing the `Users.gradidoID` with a unique UUID
+* creating a new `UserContacts` entry with the given email address and *main contact* as `contactChannels`
+* setting `emailID` in the `Users` table as foreign key to the new `UserContacts` entry
+* setting `Users.passphraseEncryptionType = 2` and encrypting the passphrase with the `Users.gradidoID` instead of the `UserContacts.email`
+
+#### Login Process
+
+The logic of the login process has to be adapted by
+
+* searching the user's data by reading the `Users` and the `UserContacts` table with the email (or the alias, as soon as the user can maintain his profile with an alias) as input
+* decrypting the stored password depending on the `Users.passphraseEncryptionType`
+  * = 1 : with the email
+  * = 2 : with the gradidoID
+
+#### Password En/Decryption
+
+The logic of the password en/decryption has to be adapted by encapsulating it so that it is controlled by an input parameter. The input parameter can be the email or the gradido-ID.
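The encapsulated en/decryption logic could be keyed by a single discriminated parameter, as in this hedged sketch (the actual gradido key derivation differs; scrypt and the names are used here only for illustration):

```typescript
import { scryptSync } from "node:crypto";

// passphraseEncryptionType = 1 derives from the email,
// passphraseEncryptionType = 2 derives from the gradido-ID.
type EncryptionKeySource =
  | { type: 1; email: string }
  | { type: 2; gradidoId: string };

// Derive the symmetric key for passphrase en/decryption from either
// the email or the gradido-ID, selected by a single input parameter.
function deriveKey(passphrase: string, source: EncryptionKeySource): Buffer {
  const salt = source.type === 1 ? source.email : source.gradidoId;
  return scryptSync(passphrase, salt, 32);
}
```

With this shape, the change-password process below only has to pass a different `EncryptionKeySource` for decryption and encryption.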
+
+#### Change Password Process
+
+The logic of change password has to be adapted by
+
+* if the `Users.passphraseEncryptionType` = 1, then
+
+  * read the user's email address from the `UserContacts` table
+  * use the email address as input for the password decryption of the existing password
+  * use the `Users.gradidoID` as input for the password encryption of the new password
+  * change the `Users.passphraseEncryptionType` to the new value = 2
+* if the `Users.passphraseEncryptionType` = 2, then
+
+  * use the `Users.gradidoID` as input for the password decryption of the existing password
+  * use the `Users.gradidoID` as input for the password encryption of the new password
+
+#### Search- and Access Logic
+
+A new logic has to be introduced to search the user identity by different input values. That means searching the user data must be possible by
+
+* searching per email (only with main contact as contact channel)
+* searching per gradidoID
+* searching per alias
+
+#### Identity-Mapping
+
+A new mapping logic will be necessary to allow using unmigrated APIs like the GDT-server's api. So it must be possible to give this identity-mapping logic the following input to get the respective output:
+
+* email -> gradidoID
+* email -> alias
+* gradidoID -> email
+* gradidoID -> alias
+* alias -> email
+* alias -> gradidoID
+
+#### GDT-Access
+
+To use the GDT-server's api, the identifier used for GDT has to be switched from email to gradidoID.
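The identity-mapping table above can be sketched as a small resolver. This is an illustrative in-memory version with assumed names; the real lookup would of course go through the `Users` and `UserContacts` tables:

```typescript
interface Identity { gradidoId: string; email: string; alias: string; }

// In-memory sketch of the identity mapping: resolve any of the three
// identifiers (email, gradido-ID, alias) to the full identity, then
// read off whichever identifier an unmigrated API needs.
class IdentityMapper {
  private byEmail = new Map<string, Identity>();
  private byId = new Map<string, Identity>();
  private byAlias = new Map<string, Identity>();

  add(identity: Identity): void {
    this.byEmail.set(identity.email, identity);
    this.byId.set(identity.gradidoId, identity);
    this.byAlias.set(identity.alias, identity);
  }

  resolve(key: string): Identity | undefined {
    return this.byEmail.get(key) ?? this.byId.get(key) ?? this.byAlias.get(key);
  }
}
```

All six input/output combinations listed above reduce to a single `resolve` followed by reading the wanted field.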
diff --git a/docu/Concepts/TechnicalRequirements/graphics/FederationHyperSwarm.drawio b/docu/Concepts/TechnicalRequirements/graphics/FederationHyperSwarm.drawio
new file mode 100644
index 000000000..b5f120f52
--- /dev/null
+++ b/docu/Concepts/TechnicalRequirements/graphics/FederationHyperSwarm.drawio
@@ -0,0 +1,650 @@
(650 lines of draw.io XML, stripped during text extraction)
\ No newline at end of file
diff --git a/docu/Concepts/TechnicalRequirements/image/FederationHyperSwarm.png b/docu/Concepts/TechnicalRequirements/image/FederationHyperSwarm.png
new file mode 100644
index 000000000..526da1721
Binary files /dev/null and b/docu/Concepts/TechnicalRequirements/image/FederationHyperSwarm.png differ
diff --git a/docu/Style/Images/Bild_1_1920.jpg
b/docu/Style/Images/Bild_1_1920.jpg
new file mode 100644
index 000000000..561e49f57
Binary files /dev/null and b/docu/Style/Images/Bild_1_1920.jpg differ
diff --git a/docu/Style/Images/Bild_1_2400.jpg b/docu/Style/Images/Bild_1_2400.jpg
new file mode 100644
index 000000000..834ec73df
Binary files /dev/null and b/docu/Style/Images/Bild_1_2400.jpg differ
diff --git a/docu/Style/Images/Bild_2_1920.jpg b/docu/Style/Images/Bild_2_1920.jpg
new file mode 100644
index 000000000..db13280c8
Binary files /dev/null and b/docu/Style/Images/Bild_2_1920.jpg differ
diff --git a/docu/Style/Images/Bild_2_2400.jpg b/docu/Style/Images/Bild_2_2400.jpg
new file mode 100644
index 000000000..fc66d9bed
Binary files /dev/null and b/docu/Style/Images/Bild_2_2400.jpg differ
diff --git a/docu/Style/Images/Bild_3_1920.jpg b/docu/Style/Images/Bild_3_1920.jpg
new file mode 100644
index 000000000..1ff4fbc9a
Binary files /dev/null and b/docu/Style/Images/Bild_3_1920.jpg differ
diff --git a/docu/Style/Images/Bild_3_2400.jpg b/docu/Style/Images/Bild_3_2400.jpg
new file mode 100644
index 000000000..d81b4c3f4
Binary files /dev/null and b/docu/Style/Images/Bild_3_2400.jpg differ
diff --git a/docu/graphics/brainstorm-gradido.drawio b/docu/graphics/brainstorm-gradido.drawio
new file mode 100644
index 000000000..c6fb1c9a4
--- /dev/null
+++ b/docu/graphics/brainstorm-gradido.drawio
@@ -0,0 +1,327 @@
(327 lines of draw.io XML, stripped during text extraction)
\ No newline at end of file
diff --git a/docu/other/Contribution/contribution-mockup-konzept.odt b/docu/other/Contribution/contribution-mockup-konzept.odt
new file mode 100644
index 000000000..9644062f9
Binary files /dev/null and b/docu/other/Contribution/contribution-mockup-konzept.odt differ
diff --git a/docu/other/Contribution/contribution-mockup-konzept.pdf b/docu/other/Contribution/contribution-mockup-konzept.pdf
new file mode 100644
index 000000000..14b63d2c0
Binary files /dev/null and b/docu/other/Contribution/contribution-mockup-konzept.pdf differ
diff --git a/frontend/.eslintrc.js b/frontend/.eslintrc.js
index 4e45ede62..f32eca810 100644
--- a/frontend/.eslintrc.js
+++ b/frontend/.eslintrc.js
@@ -45,16 +45,15 @@ module.exports = {
       extensions: ['.js', '.vue'],
       // TODO: remove ignores
       ignores: [
-        '/site.thx./',
         '/form./',
         '/time./',
         '/decay.types./',
         'settings.password.resend_subtitle',
+        'settings.password.reset',
         'settings.password.reset-password.text',
         'settings.password.set',
         'settings.password.set-password.text',
         'settings.password.subtitle',
-        'site.login.signin',
       ],
       enableFix: false,
     },
diff --git a/frontend/.prettierrc.js b/frontend/.prettierrc.js
index e88113754..bc1d767d7 100644
--- a/frontend/.prettierrc.js
+++ b/frontend/.prettierrc.js
@@ -4,5 +4,6 @@ module.exports = {
   singleQuote: true,
   trailingComma: "all",
   tabWidth: 2,
-  bracketSpacing: true
+  bracketSpacing: true,
+  endOfLine: "auto",
 };
diff --git a/frontend/package.json b/frontend/package.json
index 9d70ace58..ae5dca33c 100755
--- a/frontend/package.json
+++ b/frontend/package.json
@@ -45,7 +45,6 @@
     "jest": "^26.6.3",
     "jest-canvas-mock": "^2.3.1",
     "jest-environment-jsdom-sixteen": "^2.0.0",
-    "particles-bg-vue": "1.2.3",
     "portal-vue": "^2.1.7",
     "prettier": "^2.2.1",
     "qrcanvas-vue": "2.1.1",
@@ -100,5 +99,6 @@
     "not ie <= 10"
   ],
   "author": "Gradido-Akademie - https://www.gradido.net/",
+  "license": "Apache-2.0",
   "description": "Gradido, the Natural Economy
of Life, is a way to worldwide prosperity and peace in harmony with nature. - Gradido, die Natürliche Ökonomie des lebens, ist ein Weg zu weltweitem Wohlstand und Frieden in Harmonie mit der Natur." } diff --git a/frontend/public/img/template/Blaetter.png b/frontend/public/img/template/Blaetter.png index af11b67f2..f54dde670 100644 Binary files a/frontend/public/img/template/Blaetter.png and b/frontend/public/img/template/Blaetter.png differ diff --git a/frontend/public/img/template/Foto_01_2400_small.jpg b/frontend/public/img/template/Foto_01_2400_small.jpg new file mode 100644 index 000000000..834ec73df Binary files /dev/null and b/frontend/public/img/template/Foto_01_2400_small.jpg differ diff --git a/frontend/public/img/template/Foto_02_2400_small.jpg b/frontend/public/img/template/Foto_02_2400_small.jpg new file mode 100644 index 000000000..fc66d9bed Binary files /dev/null and b/frontend/public/img/template/Foto_02_2400_small.jpg differ diff --git a/frontend/public/img/template/Foto_03_2400_small.jpg b/frontend/public/img/template/Foto_03_2400_small.jpg new file mode 100644 index 000000000..d81b4c3f4 Binary files /dev/null and b/frontend/public/img/template/Foto_03_2400_small.jpg differ diff --git a/frontend/src/App.spec.js b/frontend/src/App.spec.js new file mode 100644 index 000000000..79467e2a8 --- /dev/null +++ b/frontend/src/App.spec.js @@ -0,0 +1,61 @@ +import { mount, RouterLinkStub } from '@vue/test-utils' +import App from './App' + +const localVue = global.localVue +const mockStoreCommit = jest.fn() + +const stubs = { + RouterLink: RouterLinkStub, + RouterView: true, +} + +describe('App', () => { + const mocks = { + $i18n: { + locale: 'en', + }, + $t: jest.fn((t) => t), + $store: { + commit: mockStoreCommit, + state: { + token: null, + }, + }, + $route: { + meta: { + requiresAuth: false, + }, + }, + } + + let wrapper + + const Wrapper = () => { + return mount(App, { localVue, mocks, stubs }) + } + + describe('mount', () => { + beforeEach(() => { + 
wrapper = Wrapper() + }) + + it('renders the App', () => { + expect(wrapper.find('#app').exists()).toBe(true) + }) + + it('has a component AuthLayout', () => { + expect(wrapper.findComponent({ name: 'AuthLayout' }).exists()).toBe(true) + }) + + describe('route requires authorization', () => { + beforeEach(() => { + mocks.$route.meta.requiresAuth = true + wrapper = Wrapper() + }) + + it('has a component DashboardLayout', () => { + expect(wrapper.findComponent({ name: 'DashboardLayout' }).exists()).toBe(true) + }) + }) + }) +}) diff --git a/frontend/src/App.vue b/frontend/src/App.vue index d7945ec69..b7d4d1154 100755 --- a/frontend/src/App.vue +++ b/frontend/src/App.vue @@ -1,47 +1,49 @@ + diff --git a/frontend/src/assets/scss/custom/gradido-custom/_body.scss b/frontend/src/assets/scss/custom/gradido-custom/_body.scss index df8f91f5b..b45b24b18 100644 --- a/frontend/src/assets/scss/custom/gradido-custom/_body.scss +++ b/frontend/src/assets/scss/custom/gradido-custom/_body.scss @@ -1,4 +1,4 @@ // Body -$body-bg: #f8f9fe !default; +$body-bg: #fff !default; $body-color: $gray-700 !default; diff --git a/frontend/src/assets/scss/custom/gradido-custom/_custom-forms.scss b/frontend/src/assets/scss/custom/gradido-custom/_custom-forms.scss index 4fa437b38..0d9fb946e 100644 --- a/frontend/src/assets/scss/custom/gradido-custom/_custom-forms.scss +++ b/frontend/src/assets/scss/custom/gradido-custom/_custom-forms.scss @@ -16,7 +16,9 @@ $custom-control-indicator-active-bg: $component-active-bg !default; $custom-control-indicator-active-border-color: $component-active-border-color !default; $custom-control-indicator-active-box-shadow: $custom-control-indicator-box-shadow !default; $custom-control-indicator-checked-color: $component-active-color !default; -$custom-control-indicator-checked-bg: $component-active-bg !default; + +// $custom-control-indicator-checked-bg: $component-active-bg !default; +$custom-control-indicator-checked-bg: #047006 !default; 
$custom-control-indicator-checked-border-color: $component-active-border-color !default; $custom-control-indicator-checked-box-shadow: $custom-control-indicator-box-shadow !default; @@ -24,6 +26,8 @@ $custom-control-indicator-checked-box-shadow: $custom-control-indicator-box-shad $custom-control-indicator-checked-disabled-bg: theme-color("primary") !default; $custom-control-indicator-disabled-bg: $gray-200 !default; $custom-control-label-disabled-color: $gray-600 !default; -$custom-checkbox-indicator-border-radius: $border-radius-sm !default; + +// $custom-checkbox-indicator-border-radius: $border-radius-sm !default; +$custom-checkbox-indicator-border-radius: 50px !default; // $custom-checkbox-indicator-icon-checked: str-replace(url("data:image/svg+xml !default;charset=utf8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3E%3Cpath fill='%23fff' d='M6.564.75l-3.59 3.612-1.538-1.55L0 4.26 2.974 7.25 8 2.193z'/%3E%3C/svg%3E"), "#", "%23") !default; diff --git a/frontend/src/assets/scss/custom/gradido-custom/_grid-breakpoint.scss b/frontend/src/assets/scss/custom/gradido-custom/_grid-breakpoint.scss index af4296cf6..b6dc059fb 100644 --- a/frontend/src/assets/scss/custom/gradido-custom/_grid-breakpoint.scss +++ b/frontend/src/assets/scss/custom/gradido-custom/_grid-breakpoint.scss @@ -4,7 +4,7 @@ $grid-breakpoints: ( xs: 0, sm: 576px, md: 768px, - lg: 992px, + lg: 1025px, xl: 1200px ); diff --git a/frontend/src/assets/scss/custom/gradido-custom/_sections.scss b/frontend/src/assets/scss/custom/gradido-custom/_sections.scss index df8f91f5b..f696cc4cd 100644 --- a/frontend/src/assets/scss/custom/gradido-custom/_sections.scss +++ b/frontend/src/assets/scss/custom/gradido-custom/_sections.scss @@ -1,4 +1,13 @@ -// Body +// Sections -$body-bg: #f8f9fe !default; -$body-color: $gray-700 !default; +// $section-colors: () !default; +// $section-colors: map-merge( +// ( +// "primary": $body-bg, +// "secondary": $secondary, +// "light": $gray-400, +// "dark": $dark, +// 
"darker": $darker +// ), +// $section-colors +// ); diff --git a/frontend/src/assets/scss/fonts/WorkSans-VariableFont_wght.ttf b/frontend/src/assets/scss/fonts/WorkSans-VariableFont_wght.ttf new file mode 100644 index 000000000..09829a516 Binary files /dev/null and b/frontend/src/assets/scss/fonts/WorkSans-VariableFont_wght.ttf differ diff --git a/frontend/src/assets/scss/gradido-template.scss b/frontend/src/assets/scss/gradido-template.scss new file mode 100644 index 000000000..09c8588e9 --- /dev/null +++ b/frontend/src/assets/scss/gradido-template.scss @@ -0,0 +1,217 @@ +html, +body { + height: 100%; +} + +.pointer { + cursor: pointer; +} + +.c-grey { + color: #383838 !important; +} + +.c-blau { + color: #0e79bc !important; +} + +/* Navbar */ +a, +.navbar-light, +.navbar-nav, +.nav-link { + color: #047006; +} + +a:hover, +.nav-link:hover { + color: #383838 !important; +} + +.navbar-light .navbar-nav .nav-link.active { + color: rgb(35 121 188 / 90%); +} + +.text-gradido { + color: rgb(249 205 105 / 100%); +} + +.gradient-gradido { + background-image: linear-gradient(146deg, rgb(220 167 44) 50%, rgb(197 141 56 / 100%) 100%); +} + +/* Button */ +.btn { + border-radius: 25px; +} + +.btn-gradido { + display: inline-block; + padding: 0.6em 3em; + letter-spacing: 0.05em; + color: #fff; + transition: all 0.5s ease; + background: rgb(249 205 105); + background: linear-gradient(135deg, rgb(249 205 105 / 100%) 2%, rgb(197 141 56 / 100%) 55%); + box-shadow: rgb(0 0 0 / 40%) 0 30px 90px; + border-radius: 26px; + padding-right: 50px; + padding-left: 50px; + border-style: none; +} + +.btn-gradido:hover { + color: #fff; + box-shadow: 0 5px 10px rgb(0 0 0 / 20%); +} + +.btn-gradido:focus { + outline: none; +} + +.btn-gradido-disable { + padding: 0.6em 3em; + letter-spacing: 0.05em; + color: #fff; + transition: all 0.5s ease; + background: rgb(97 97 97); + background: linear-gradient(135deg, rgb(180 180 180 / 100%) 46%, rgb(180 180 180 / 100%) 99%); + box-shadow: rgb(0 0 0 / 40%) 
0 30px 90px; + border-radius: 26px; + padding-right: 50px; + padding-left: 50px; + border-style: none; +} + +.btn-gradido-disable:hover { + color: #fff; +} + +.btn-outline-gradido { + color: rgb(140 121 88); + border: 1px solid #f5b805; + box-shadow: 10px 10px 50px 10px rgb(56 56 56 / 31%); +} + +.btn-outline-gradido:hover { + box-shadow: 10px 10px 50px 10px rgb(56 56 56 / 0%); +} + +.form-control, +.custom-select { + border-radius: 17px; + height: 50px; +} + +.rounded-right { + border-top-right-radius: 17px !important; + border-bottom-right-radius: 17px !important; +} + +.alert-success { + background-color: #d4edda; + border-color: #c3e6cb; + color: #155724; +} + +.alert-danger { + color: #721c24; + background-color: #f8d7da; + border-color: #f5c6cb; +} + +.b-toast-danger .toast .toast-header { + color: #721c24; + background-color: rgb(248 215 218 / 85%); + border-bottom-color: rgb(245 198 203 / 85%); +} + +.b-toast-danger .toast .toast-body { + background-color: rgb(252 237 238 / 85%); + border-color: rgb(245 198 203 / 85%); + color: #721c24; +} + +.b-toast-success .toast .toast-header { + color: #155724; + background-color: rgb(212 237 218 / 58%); + border-bottom-color: rgb(195 230 203 / 85%); +} + +.b-toast-success .toast .toast-body { + color: #155724; + background-color: rgb(212 237 218 / 85%); + border-bottom-color: rgb(195 230 203 / 85%); +} + +// .btn-primary pim { +.btn-primary { + background-color: #5a7b02; + border-color: #5e72e4; +} + +.gradido-font-large { + font-size: large; + height: auto !important; +} + +.font2em { + font-size: 1.5em; +} + +.zindex10 { + z-index: 10; +} + +.zindex100 { + z-index: 100; +} + +.zindex1000 { + z-index: 1000; +} + +.zindex10000 { + z-index: 10000; +} + +.zindex100000 { + z-index: 100000; +} + +.gradido-global-color-blue { + color: #0e79bc; +} + +.gradido-global-color-accent { + color: #047006; +} + +.gradido-global-color-gray { + color: #858383; +} + +.gradido-custom-background { + background-color: #ebebeba3 
!important; + border-radius: 25pt; +} + +.gradido-width-300 { + width: 300px; +} + +.gradido-width-96 { + width: 96%; +} + +.gradido-no-border-radius { + border-radius: 0; +} + +.gradido-no-border { + border: 0; +} + +.gradido-font-15rem { + font-size: 1.5rem; +} diff --git a/frontend/src/assets/scss/gradido.scss b/frontend/src/assets/scss/gradido.scss index 68577d0b5..7366eb466 100644 --- a/frontend/src/assets/scss/gradido.scss +++ b/frontend/src/assets/scss/gradido.scss @@ -51,165 +51,4 @@ // Bootstrap-vue (2.21.1) scss @import "~bootstrap-vue/src/index"; - -.alert-success { - color: #155724; - background-color: #d4edda; - border-color: #c3e6cb; -} - -.alert-danger { - color: #721c24; - background-color: #f8d7da; - border-color: #f5c6cb; -} - -.b-toast-danger .toast .toast-header { - color: #721c24; - background-color: rgb(248 215 218 / 85%); - border-bottom-color: rgb(245 198 203 / 85%); -} - -.b-toast-danger .toast .toast-body { - background-color: rgb(252 237 238 / 85%); - border-color: rgb(245 198 203 / 85%); - color: #721c24; -} - -.b-toast-success .toast .toast-header { - color: #155724; - background-color: rgb(212 237 218 / 58%); - border-bottom-color: rgb(195 230 203 / 85%); -} - -.b-toast-success .toast .toast-body { - color: #155724; - background-color: rgb(212 237 218 / 85%); - border-bottom-color: rgb(195 230 203 / 85%); -} - -// .btn-primary pim { -.btn-primary { - background-color: #5a7b02; - border-color: #5e72e4; -} - -.gradido-font-large { - font-size: large; - height: auto !important; -} - -a, -.copyright { - color: #5a7b02; -} - -.font12em { - font-size: 1.2em; -} - -.font2em { - font-size: 1.5em; -} - -.gradido-global-color-text { - color: #3d443b; -} - -.gradido-global-color-accent { - color: #047006; -} - -.gradido-global-color-6e0a9c9e { - color: #000; -} - -.gradido-global-color-2d0fb154 { - color: #047006; -} - -.gradido-global-color-16efe88c { - color: #7ebc55; -} - -.gradido-global-color-1939326 { - color: #f6fff6; -} - 
-.gradido-global-color-9d79fc1 { - color: #047006; -} - -.gradido-global-color-6347f4d { - color: #5a7b02; -} - -.gradido-global-color-4fbc19a { - color: #014034; -} - -.gradido-global-color-d341874 { - color: #b6d939; -} - -.gradido-global-color-619d338 { - color: #8ebfb1; -} - -.gradido-global-color-44819a9 { - color: #026873; -} - -.gradido-global-color-gray { - color: #858383; -} - -.gradido-custom-background { - background-color: #ebebeba3 !important; -} - -.gradido-shadow-inset { - box-shadow: inset 0.3em rgba(241 187 187 / 100%); -} - -.gradido-max-width { - width: 100%; -} - -.gradido-width-300 { - width: 300px; -} - -.gradido-absolute { - position: absolute; -} - -.gradido-width-95-absolute { - width: 95%; - position: absolute; -} - -.gradido-width-96-absolute { - width: 96%; - position: absolute; -} - -.gradido-no-border-radius { - border-radius: 0; -} - -.gradido-no-border { - border: 0; -} - -.gradido-background-f1 { - background-color: #f1f1f1; -} - -.gradido-background-white { - background-color: #fff; -} - -.gradido-font-15rem { - font-size: 1.5rem; -} +@import "gradido-template"; diff --git a/frontend/src/components/Auth/AuthCarousel.vue b/frontend/src/components/Auth/AuthCarousel.vue new file mode 100644 index 000000000..db3037828 --- /dev/null +++ b/frontend/src/components/Auth/AuthCarousel.vue @@ -0,0 +1,33 @@ + + + + + diff --git a/frontend/src/components/Auth/AuthFooter.vue b/frontend/src/components/Auth/AuthFooter.vue new file mode 100644 index 000000000..d74593e36 --- /dev/null +++ b/frontend/src/components/Auth/AuthFooter.vue @@ -0,0 +1,61 @@ + + + + + diff --git a/frontend/src/components/Auth/AuthMobileStart.vue b/frontend/src/components/Auth/AuthMobileStart.vue new file mode 100644 index 000000000..7b11df0f3 --- /dev/null +++ b/frontend/src/components/Auth/AuthMobileStart.vue @@ -0,0 +1,127 @@ + + + + + diff --git a/frontend/src/components/Auth/AuthNavbar.vue b/frontend/src/components/Auth/AuthNavbar.vue new file mode 100644 index 
000000000..7b3eed0e3 --- /dev/null +++ b/frontend/src/components/Auth/AuthNavbar.vue @@ -0,0 +1,73 @@ + + + + + diff --git a/frontend/src/components/Auth/AuthNavbarSmall.vue b/frontend/src/components/Auth/AuthNavbarSmall.vue new file mode 100644 index 000000000..836e72aeb --- /dev/null +++ b/frontend/src/components/Auth/AuthNavbarSmall.vue @@ -0,0 +1,17 @@ + + + diff --git a/frontend/src/components/DecayInformations/CollapseLinksList.vue b/frontend/src/components/DecayInformations/CollapseLinksList.vue index 0c1db5f0a..ce742e66e 100644 --- a/frontend/src/components/DecayInformations/CollapseLinksList.vue +++ b/frontend/src/components/DecayInformations/CollapseLinksList.vue @@ -1,7 +1,7 @@ + + diff --git a/frontend/src/components/Menu/Navbar.vue b/frontend/src/components/Menu/Navbar.vue index f998783f7..2f26f381e 100644 --- a/frontend/src/components/Menu/Navbar.vue +++ b/frontend/src/components/Menu/Navbar.vue @@ -1,5 +1,5 @@