Mirror of https://github.com/Ocelot-Social-Community/Ocelot-Social.git (synced 2025-12-13 07:46:06 +00:00)
Merge branch 'master' of https://github.com/Ocelot-Social-Community/Ocelot-Social into dependabot/npm_and_yarn/webapp/storybook/addon-actions-5.3.21
This commit is contained in: commit 7d2276ee05
.github/ISSUE_TEMPLATE/bug_report.md (vendored): 29 changed lines
@@ -1,32 +1,9 @@
---
name: 🐛 Bug report
about: Create a report to help us improve
name: 🐛 Bug Report
about: Create a report to help us to improve.
labels: bug
title: 🐛 [Bug]
---

## :bug: Bugreport
## :bug: Bug Report
<!-- Describe your issue in detail. Include screenshots if needed. Give us as much information as possible. Use a clear and concise description of what the bug is.-->


### Steps to reproduce the behavior
1.
2.
3.
4. ...
5. Profit


### Expected behavior
<!-- A clear and concise description of what you expected to happen. -->


### Version & Environment
Type: [] <!-- [Desktop|Smartphone] -->
- OS: [] <!-- [e.g. iOS8.1 or Windows] -->
- Browser: [] <!-- [e.g. stock browser, safari, chrome] -->
- Version [] <!-- [e.g. 22] -->
- Device: [] <!-- [e.g. iPhone6] -->

### Additional context
<!-- Add any other context about the problem here. -->
.github/ISSUE_TEMPLATE/devops_ticket.md (vendored): 21 changed lines
@@ -1,24 +1,9 @@
---
name: 💥 DevOps ticket
about: Help us manage our deployed App.
name: 💥 DevOps Ticket
about: Help us manage our deployed app.
labels: devops
title: 💥 [DevOps]
---

## :fire: DevOps ticket
## 💥 DevOps Ticket
<!-- Describe your issue in detail. Include screenshots if needed. Give us as much information as possible. Use a clear and concise description of what the problem is.-->

### Motive
<!-- Why does this task need to be done? What can we benefit from this? -->

### Related issues
<!-- Are there any related issues to link to? Please paste them below for reference. -->

### Implementation
<!-- Please, document any ideas of how the task can be performed. -->

### Validation
<!-- How can we make sure that this task was successful? -->

### Additional context
<!-- Add other context or background about the feature request here.-->
.github/ISSUE_TEMPLATE/epic.md (vendored, new file): 12 changed lines
@@ -0,0 +1,12 @@
---
name: 🌟 Epic
about: Define a big development step.
labels: epic
title: 🌟 [EPIC]
---
<!-- THIS ISSUE-TYPE IS NOT FOR YOU! -->
<!-- If you need an answer right away, visit the ocelot.social Discord:
https://discord.gg/AJSX9DCSUA -->

## 🌟 EPIC
<!-- Describe your Epic in detail. Include screenshots and drawings -->
.github/ISSUE_TEMPLATE/feature_request.md (vendored): 21 changed lines
@@ -1,24 +1,9 @@
---
name: 🚀 Feature request
about: Suggest an idea for this project
name: 🚀 Feature Request
about: Suggest an idea for this project.
labels: feature
title: 🚀 [Feature]
---

## :rocket: Feature
## :rocket: Feature Request
<!-- Give a short summary of the Feature. Use Screenshots if you want. -->

### User Problem
<!-- Which problem is this solving? Why do you think this is important? Who will benefit from it and how? -->

### Implementation
<!-- How do you think this feature should be implemented? How will it be used? Where in the network should it be located? Which steps and screens are involved? -->

### Design & Layout
<!-- Attach Screenshots and Sketches to illustrate your idea. -->

### Validation
<!-- How can we make sure that this feature indeed solves the above problem? How do we know if it has been accepted by the users of the network, once released? -->

### Additional context
<!-- Add other context or background about the feature request here.-->
.github/ISSUE_TEMPLATE/question.md (vendored): 10 changed lines
@@ -1,12 +1,12 @@
---
name: 💬 Question
about: If you need help understanding HumanConnection.
about: If you need help understanding ocelot.social.
labels: question
title: 💬 [Question]
---
<!-- Chat with Team HumanConnection -->
<!-- If you need an answer right away, visit the HumanConnection Discord:
https://discord.gg/Q3mpcgr -->
<!-- Chat with ocelot.social team -->
<!-- If you need an answer right away, visit the ocelot.social Discord:
https://discord.gg/AJSX9DCSUA -->

## :speech_balloon: Question
## 💬 Question
<!-- Describe your Question in detail. Include screenshots and drawings if needed. -->
.github/ISSUE_TEMPLATE/refactor_tickets.md (vendored): 14 changed lines
@@ -1,20 +1,10 @@
---
name: 🔧 Refactor ticket
name: 🔧 Refactor
about: Help us improve our code by refactoring it.
labels: refactor
title: 🔧 [Refactor]
---

## :zap: Refactor ticket
## 🔧 Refactor
<!-- Describe your issue in detail. Include screenshots if needed. Give us as much information as possible. Use a clear and concise description of what the problem is.-->

### Motive
<!-- What is the purpose of this refactoring? If it's removing depcrecated code, please link to the deprecation notice. -->
### Related issues
<!-- Are there any related issues to link to? Please paste them below for reference. -->

### Implementation
<!-- Please, document any ideas of how the code should be refactored. -->

### Additional context
<!-- Add other context or background about the feature request here.-->
.github/dependabot.yml (vendored, new file): 179 changed lines
@@ -0,0 +1,179 @@
version: 2
updates:
- package-ecosystem: npm
  directory: "/"
  schedule:
    interval: daily
    time: "04:00"
  open-pull-requests-limit: 10
  ignore:
  - dependency-name: cypress
    versions:
    - 6.3.0
    - 6.4.0
    - 6.5.0
    - 6.6.0
    - 6.7.1
    - 6.8.0
    - 7.0.0
    - 7.0.1
    - 7.1.0
  - dependency-name: cypress-cucumber-preprocessor
    versions:
    - 4.0.0
    - 4.0.1
    - 4.0.3
  - dependency-name: date-fns
    versions:
    - 2.16.1
    - 2.17.0
    - 2.18.0
    - 2.19.0
    - 2.20.0
    - 2.20.1
    - 2.20.2
    - 2.20.3
    - 2.21.0
  - dependency-name: cypress-file-upload
    versions:
    - 5.0.2
    - 5.0.3
    - 5.0.4
    - 5.0.5
  - dependency-name: neo4j-driver
    versions:
    - 4.2.2
- package-ecosystem: npm
  directory: "/backend"
  schedule:
    interval: daily
    time: "04:00"
  open-pull-requests-limit: 10
  ignore:
  - dependency-name: y18n
    versions:
    - 4.0.1
    - 4.0.2
  - dependency-name: metascraper-publisher
    versions:
    - 5.16.16
    - 5.18.1
    - 5.18.12
    - 5.18.2
    - 5.18.4
    - 5.18.5
    - 5.18.6
    - 5.18.9
    - 5.20.0
    - 5.21.0
    - 5.21.2
    - 5.21.3
    - 5.21.4
    - 5.21.5
  - dependency-name: metascraper-author
    versions:
    - 5.16.16
    - 5.18.1
    - 5.18.12
    - 5.18.2
    - 5.18.4
    - 5.18.5
    - 5.18.6
    - 5.18.9
    - 5.20.0
    - 5.21.0
    - 5.21.2
    - 5.21.3
    - 5.21.4
    - 5.21.5
  - dependency-name: neo4j-driver
    versions:
    - 4.2.2
  - dependency-name: neo4j-graphql-js
    versions:
    - 2.19.1
  - dependency-name: mustache
    versions:
    - 4.1.0
- package-ecosystem: npm
  directory: "/webapp"
  schedule:
    interval: daily
    time: "04:00"
  open-pull-requests-limit: 10
  ignore:
  - dependency-name: nuxt
    versions:
    - 2.14.12
    - 2.15.0
    - 2.15.1
    - 2.15.2
    - 2.15.3
  - dependency-name: v-tooltip
    versions:
    - 2.1.2
  - dependency-name: "@vue/server-test-utils"
    versions:
    - 1.1.2
    - 1.1.3
  - dependency-name: node-notifier
    versions:
    - 8.0.1
- package-ecosystem: docker
  directory: "/webapp"
  schedule:
    interval: daily
    time: "04:00"
  open-pull-requests-limit: 10
  ignore:
  - dependency-name: node
    versions:
    - ">= 15.5.a, < 15.6"
  - dependency-name: node
    versions:
    - 15.10.0.pre.alpine3.10
    - 15.11.0.pre.alpine3.10
    - 15.12.0.pre.alpine3.10
    - 15.13.0.pre.alpine3.10
    - 15.7.0.pre.alpine3.10
    - 15.8.0.pre.alpine3.10
    - 15.9.0.pre.alpine3.10
- package-ecosystem: docker
  directory: "/backend"
  schedule:
    interval: daily
    time: "04:00"
  open-pull-requests-limit: 10
  ignore:
  - dependency-name: node
    versions:
    - ">= 15.4.a, < 15.5"
  - dependency-name: node
    versions:
    - ">= 15.5.a, < 15.6"
  - dependency-name: node
    versions:
    - 15.10.0.pre.alpine3.10
    - 15.11.0.pre.alpine3.10
    - 15.12.0.pre.alpine3.10
    - 15.13.0.pre.alpine3.10
    - 15.7.0.pre.alpine3.10
    - 15.8.0.pre.alpine3.10
    - 15.9.0.pre.alpine3.10
- package-ecosystem: docker
  directory: "/neo4j"
  schedule:
    interval: daily
    time: "04:00"
  open-pull-requests-limit: 10
  ignore:
  - dependency-name: neo4j
    versions:
    - 4.2.3
    - 4.2.4
- package-ecosystem: docker
  directory: "/deployment/legacy-migration/maintenance-worker"
  schedule:
    interval: daily
    time: "04:00"
  open-pull-requests-limit: 10
.github/semantic.yml (vendored): 2 changed lines
@@ -1,2 +0,0 @@
# Always validate the PR title, and ignore the commits
titleOnly: true
.github/stale-disabled.yml (vendored): 18 changed lines
@@ -1,18 +0,0 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 60
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 30
# Issues with these labels will never be considered stale
exemptLabels:
  - pinned
  - security
  - bounty
# Label to use when marking an issue as stale
staleLabel: stale
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
  This issue has been automatically marked as stale because it has not had
  recent activity. It will be closed if no further activity occurs. Thank you
  for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false
.github/workflows/ci.yml (vendored): 46 changed lines
@@ -1,46 +0,0 @@
name: CI

on: [push]

jobs:
  build:
    name: Continuous Integration
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Check translation files
        run: |
          scripts/translations/sort.sh
          scripts/translations/missing-keys.sh

      - name: Build neo4j image
        uses: docker/build-push-action@v1.1.0
        with:
          repository: ocelotsocialnetwork/neo4j
          tags: latest
          path: neo4j/
          push: false
      - name: Build backend base image
        uses: docker/build-push-action@v1.1.0
        with:
          repository: ocelotsocialnetwork/backend
          tags: build-and-test
          target: build-and-test
          path: backend/
          push: false
      - name: Build webapp base image
        uses: docker/build-push-action@v1.1.0
        with:
          repository: ocelotsocialnetwork/webapp
          tags: build-and-test
          target: build-and-test
          path: webapp/
          push: false

      - name: Lint backend
        run: docker run --rm ocelotsocialnetwork/backend:build-and-test yarn run lint
      - name: Lint webapp
        run: docker run --rm ocelotsocialnetwork/webapp:build-and-test yarn run lint
.github/workflows/publish.yml (vendored, new file): 316 changed lines
@@ -0,0 +1,316 @@
name: ocelot.social publish CI

on:
  push:
    branches:
      - master
      # - 4451-new-deployment-with-base-and-code # for testing while developing

jobs:
  ##############################################################################
  # JOB: PREPARE ###############################################################
  ##############################################################################
  prepare:
    name: Prepare
    runs-on: ubuntu-latest
    # needs: [nothing]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # TODO: DO STUFF ??? #####################################################
      ##########################################################################
      - name: Check translation files
        run: |
          scripts/translations/sort.sh
          scripts/translations/missing-keys.sh

  ##############################################################################
  # JOB: DOCKER BUILD COMMUNITY NEO4J ##########################################
  ##############################################################################
  build_production_neo4j:
    name: Docker Build Production - Neo4J
    runs-on: ubuntu-latest
    needs: [prepare]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # SET ENVS ###############################################################
      ##########################################################################
      - name: ENV - VERSION
        run: echo "VERSION=$(node -p -e "require('./package.json').version")" >> $GITHUB_ENV
      - name: ENV - BUILD_DATE
        run: echo "BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" >> $GITHUB_ENV
      - name: ENV - BUILD_VERSION
        run: echo "BUILD_VERSION=${VERSION}-${GITHUB_RUN_NUMBER}" >> $GITHUB_ENV
      - name: ENV - BUILD_COMMIT
        run: echo "BUILD_COMMIT=${GITHUB_SHA}" >> $GITHUB_ENV
      ##########################################################################
      # NEO4J ##################################################################
      ##########################################################################
      - name: Neo4J | Build `community` image
        run: docker build --target community -t "ocelotsocialnetwork/neo4j:latest" -t "ocelotsocialnetwork/neo4j:community" -t "ocelotsocialnetwork/neo4j:${VERSION}" -t "ocelotsocialnetwork/neo4j:${BUILD_VERSION}" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT neo4j/
      - name: Neo4J | Save docker image
        run: docker save "ocelotsocialnetwork/neo4j" > /tmp/neo4j.tar
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: docker-neo4j-community
          path: /tmp/neo4j.tar

  ##############################################################################
  # JOB: DOCKER BUILD PRODUCTION BACKEND #######################################
  ##############################################################################
  build_production_backend:
    name: Docker Build Production - Backend
    runs-on: ubuntu-latest
    needs: [prepare]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # SET ENVS ###############################################################
      ##########################################################################
      - name: ENV - VERSION
        run: echo "VERSION=$(node -p -e "require('./package.json').version")" >> $GITHUB_ENV
      - name: ENV - BUILD_DATE
        run: echo "BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" >> $GITHUB_ENV
      - name: ENV - BUILD_VERSION
        run: echo "BUILD_VERSION=${VERSION}-${GITHUB_RUN_NUMBER}" >> $GITHUB_ENV
      - name: ENV - BUILD_COMMIT
        run: echo "BUILD_COMMIT=${GITHUB_SHA}" >> $GITHUB_ENV
      ##########################################################################
      # BUILD BACKEND DOCKER IMAGE (production) ################################
      ##########################################################################
      - name: Backend | Build `production` image
        run: |
          docker build --target base -t "ocelotsocialnetwork/backend:latest-base" -t "ocelotsocialnetwork/backend:${VERSION}-base" -t "ocelotsocialnetwork/backend:${BUILD_VERSION}-base" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT backend/
          docker build --target code -t "ocelotsocialnetwork/backend:latest-code" -t "ocelotsocialnetwork/backend:${VERSION}-code" -t "ocelotsocialnetwork/backend:${BUILD_VERSION}-code" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT backend/
          docker build --target production -t "ocelotsocialnetwork/backend:latest" -t "ocelotsocialnetwork/backend:${VERSION}" -t "ocelotsocialnetwork/backend:${BUILD_VERSION}" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT backend/
      - name: Backend | Save docker image
        run: docker save "ocelotsocialnetwork/backend" > /tmp/backend.tar
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: docker-backend-production
          path: /tmp/backend.tar

  ##############################################################################
  # JOB: DOCKER BUILD PRODUCTION WEBAPP ########################################
  ##############################################################################
  build_production_webapp:
    name: Docker Build Production - WebApp
    runs-on: ubuntu-latest
    needs: [prepare]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # SET ENVS ###############################################################
      ##########################################################################
      - name: ENV - VERSION
        run: echo "VERSION=$(node -p -e "require('./package.json').version")" >> $GITHUB_ENV
      - name: ENV - BUILD_DATE
        run: echo "BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" >> $GITHUB_ENV
      - name: ENV - BUILD_VERSION
        run: echo "BUILD_VERSION=${VERSION}-${GITHUB_RUN_NUMBER}" >> $GITHUB_ENV
      - name: ENV - BUILD_COMMIT
        run: echo "BUILD_COMMIT=${GITHUB_SHA}" >> $GITHUB_ENV
      ##########################################################################
      # BUILD WEBAPP DOCKER IMAGE (build) ######################################
      ##########################################################################
      - name: Webapp | Build `production` image
        run: |
          docker build --target base -t "ocelotsocialnetwork/webapp:latest-base" -t "ocelotsocialnetwork/webapp:${VERSION}-base" -t "ocelotsocialnetwork/webapp:${BUILD_VERSION}-base" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT webapp/
          docker build --target code -t "ocelotsocialnetwork/webapp:latest-code" -t "ocelotsocialnetwork/webapp:${VERSION}-code" -t "ocelotsocialnetwork/webapp:${BUILD_VERSION}-code" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT webapp/
          docker build --target production -t "ocelotsocialnetwork/webapp:latest" -t "ocelotsocialnetwork/webapp:${VERSION}" -t "ocelotsocialnetwork/webapp:${BUILD_VERSION}" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT webapp/
      - name: Webapp | Save docker image
        run: docker save "ocelotsocialnetwork/webapp" > /tmp/webapp.tar
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: docker-webapp-production
          path: /tmp/webapp.tar

  ##############################################################################
  # JOB: DOCKER BUILD PRODUCTION MAINTENANCE ###################################
  ##############################################################################
  build_production_maintenance:
    name: Docker Build Production - Maintenance
    runs-on: ubuntu-latest
    needs: [prepare]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # SET ENVS ###############################################################
      ##########################################################################
      - name: ENV - VERSION
        run: echo "VERSION=$(node -p -e "require('./package.json').version")" >> $GITHUB_ENV
      - name: ENV - BUILD_DATE
        run: echo "BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" >> $GITHUB_ENV
      - name: ENV - BUILD_VERSION
        run: echo "BUILD_VERSION=${VERSION}-${GITHUB_RUN_NUMBER}" >> $GITHUB_ENV
      - name: ENV - BUILD_COMMIT
        run: echo "BUILD_COMMIT=${GITHUB_SHA}" >> $GITHUB_ENV
      ##########################################################################
      # BUILD MAINTENANCE DOCKER IMAGE (build) #################################
      ##########################################################################
      - name: Maintenance | Build `production` image
        run: |
          docker build --target base -t "ocelotsocialnetwork/maintenance:latest-base" -t "ocelotsocialnetwork/maintenance:${VERSION}-base" -t "ocelotsocialnetwork/maintenance:${BUILD_VERSION}-base" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT webapp/ -f webapp/Dockerfile.maintenance
          docker build --target code -t "ocelotsocialnetwork/maintenance:latest-code" -t "ocelotsocialnetwork/maintenance:${VERSION}-code" -t "ocelotsocialnetwork/maintenance:${BUILD_VERSION}-code" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT webapp/ -f webapp/Dockerfile.maintenance
          docker build --target production -t "ocelotsocialnetwork/maintenance:latest" -t "ocelotsocialnetwork/maintenance:${VERSION}" -t "ocelotsocialnetwork/maintenance:${BUILD_VERSION}" --build-arg BBUILD_DATE=$BUILD_DATE --build-arg BBUILD_VERSION=$BUILD_VERSION --build-arg BBUILD_COMMIT=$BUILD_COMMIT webapp/ -f webapp/Dockerfile.maintenance
      - name: Maintenance | Save docker image
        run: docker save "ocelotsocialnetwork/maintenance" > /tmp/maintenance.tar
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: docker-maintenance-production
          path: /tmp/maintenance.tar

  ##############################################################################
  # JOB: UPLOAD TO DOCKERHUB ###################################################
  ##############################################################################
  upload_to_dockerhub:
    name: Upload to Dockerhub
    runs-on: ubuntu-latest
    needs: [build_production_neo4j,build_production_backend,build_production_webapp,build_production_maintenance]
    env:
      DOCKERHUB_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
      DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # DOWNLOAD DOCKER IMAGES #################################################
      ##########################################################################
      - name: Download Docker Image (Neo4J)
        uses: actions/download-artifact@v2
        with:
          name: docker-neo4j-community
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/neo4j.tar
      - name: Download Docker Image (Backend)
        uses: actions/download-artifact@v2
        with:
          name: docker-backend-production
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/backend.tar
      - name: Download Docker Image (WebApp)
        uses: actions/download-artifact@v2
        with:
          name: docker-webapp-production
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/webapp.tar
      - name: Download Docker Image (Maintenance)
        uses: actions/download-artifact@v2
        with:
          name: docker-maintenance-production
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/maintenance.tar
      ##########################################################################
      # Upload #################################################################
      ##########################################################################
      - name: login to dockerhub
        run: echo "${DOCKERHUB_TOKEN}" | docker login -u "${DOCKERHUB_USERNAME}" --password-stdin
      - name: Push neo4j
        run: docker push --all-tags ocelotsocialnetwork/neo4j
      - name: Push backend
        run: docker push --all-tags ocelotsocialnetwork/backend
      - name: Push webapp
        run: docker push --all-tags ocelotsocialnetwork/webapp
      - name: Push maintenance
        run: docker push --all-tags ocelotsocialnetwork/maintenance

  ##############################################################################
  # JOB: GITHUB TAG LATEST VERSION #############################################
  ##############################################################################
  github_tag:
    name: Tag latest version on Github
    runs-on: ubuntu-latest
    needs: [upload_to_dockerhub]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
        with:
          fetch-depth: 0 # Fetch full History for changelog
      ##########################################################################
      # SET ENVS ###############################################################
      ##########################################################################
      - name: ENV - VERSION
        run: echo "VERSION=$(node -p -e "require('./package.json').version")" >> $GITHUB_ENV
      - name: ENV - BUILD_DATE
        run: echo "BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" >> $GITHUB_ENV
      - name: ENV - BUILD_VERSION
        run: echo "BUILD_VERSION=${VERSION}-${GITHUB_RUN_NUMBER}" >> $GITHUB_ENV
      - name: ENV - BUILD_COMMIT
        run: echo "BUILD_COMMIT=${GITHUB_SHA}" >> $GITHUB_ENV
      ##########################################################################
      # Push version tag to GitHub #############################################
      ##########################################################################
      # TODO: this will error on duplicate
      #- name: package-version-to-git-tag
      # uses: pkgdeps/git-tag-action@v2
      # with:
      # github_token: ${{ secrets.GITHUB_TOKEN }}
      # github_repo: ${{ github.repository }}
      # version: ${{ env.VERSION }}
      # git_commit_sha: ${{ github.sha }}
      # git_tag_prefix: "v"
      ##########################################################################
      # Push build tag to GitHub ###############################################
      ##########################################################################
      - name: package-version-to-git-tag + build number
        uses: pkgdeps/git-tag-action@v2
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          github_repo: ${{ github.repository }}
          version: ${{ env.BUILD_VERSION }}
          git_commit_sha: ${{ github.sha }}
          git_tag_prefix: "b"
      ##########################################################################
      # Push release tag to GitHub #############################################
      ##########################################################################
      - name: yarn install
        run: yarn install
      - name: generate changelog
        run: yarn auto-changelog --latest-version ${{ env.VERSION }} --unreleased-only
      - name: package-version-to-git-release
        continue-on-error: true # Will fail if tag exists
        id: create_release
        uses: actions/create-release@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token
        with:
          tag_name: ${{ env.VERSION }}
          release_name: ${{ env.VERSION }}
          body_path: ./CHANGELOG.md
          draft: false
          prerelease: false
.github/workflows/test.yml (vendored, new file): 341 changed lines
@@ -0,0 +1,341 @@
name: ocelot.social test CI


on: [push]

jobs:
  ##############################################################################
  # JOB: PREPARE #####################################################
  ##############################################################################
  prepare:
    name: Prepare
    runs-on: ubuntu-latest
    # needs: [nothing]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # TODO: DO STUFF ??? #####################################################
      ##########################################################################
      - name: Check translation files
        run: |
          scripts/translations/sort.sh
          scripts/translations/missing-keys.sh

  ##############################################################################
  # JOB: DOCKER BUILD TEST NEO4J ###############################################
  ##############################################################################
  build_test_neo4j:
    name: Docker Build Test - Neo4J
    runs-on: ubuntu-latest
    needs: [prepare]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # NEO4J ##################################################################
      ##########################################################################
      - name: Neo4J | Build `community` image
        run: |
          docker build --target community -t "ocelotsocialnetwork/neo4j:community" neo4j/
          docker save "ocelotsocialnetwork/neo4j:community" > /tmp/neo4j.tar
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: docker-neo4j-image
          path: /tmp/neo4j.tar

  ##############################################################################
  # JOB: DOCKER BUILD TEST BACKEND #############################################
  ##############################################################################
  build_test_backend:
    name: Docker Build Test - Backend
    runs-on: ubuntu-latest
    needs: [prepare]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # BUILD BACKEND DOCKER IMAGE (build) #####################################
      ##########################################################################
      - name: backend | Build `test` image
        run: |
          docker build --target test -t "ocelotsocialnetwork/backend:test" backend/
          docker save "ocelotsocialnetwork/backend:test" > /tmp/backend.tar
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: docker-backend-test
          path: /tmp/backend.tar

  ##############################################################################
  # JOB: DOCKER BUILD TEST WEBAPP ##############################################
  ##############################################################################
  build_test_webapp:
    name: Docker Build Test - WebApp
    runs-on: ubuntu-latest
    needs: [prepare]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # BUILD WEBAPP DOCKER IMAGE (build) ######################################
      ##########################################################################
      - name: webapp | Build `test` image
        run: |
          docker build --target test -t "ocelotsocialnetwork/webapp:test" webapp/
          docker save "ocelotsocialnetwork/webapp:test" > /tmp/webapp.tar
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: docker-webapp-test
          path: /tmp/webapp.tar

  ##############################################################################
  # JOB: LINT BACKEND ##########################################################
  ##############################################################################
  lint_backend:
    name: Lint backend
    runs-on: ubuntu-latest
    needs: [build_test_backend]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # DOWNLOAD DOCKER IMAGE ##################################################
      ##########################################################################
      - name: Download Docker Image (Backend)
        uses: actions/download-artifact@v2
        with:
          name: docker-backend-test
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/backend.tar
      ##########################################################################
      # LINT BACKEND ###########################################################
      ##########################################################################
      - name: backend | Lint
        run: docker run --rm ocelotsocialnetwork/backend:test yarn run lint

  ##############################################################################
  # JOB: LINT WEBAPP ###########################################################
  ##############################################################################
  lint_webapp:
    name: Lint webapp
    runs-on: ubuntu-latest
    needs: [build_test_webapp]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # DOWNLOAD DOCKER IMAGE ##################################################
      ##########################################################################
      - name: Download Docker Image (Webapp)
        uses: actions/download-artifact@v2
        with:
          name: docker-webapp-test
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/webapp.tar
      ##########################################################################
      # LINT WEBAPP ############################################################
      ##########################################################################
      - name: webapp | Lint
        run: docker run --rm ocelotsocialnetwork/webapp:test yarn run lint

  ##############################################################################
  # JOB: UNIT TEST BACKEND #####################################################
  ##############################################################################
  unit_test_backend:
    name: Unit tests - backend
    runs-on: ubuntu-latest
    needs: [build_test_neo4j,build_test_backend]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # DOWNLOAD DOCKER IMAGES #################################################
      ##########################################################################
      - name: Download Docker Image (Neo4J)
        uses: actions/download-artifact@v2
        with:
          name: docker-neo4j-image
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/neo4j.tar
      - name: Download Docker Image (Backend)
        uses: actions/download-artifact@v2
        with:
          name: docker-backend-test
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/backend.tar
      ##########################################################################
      # UNIT TESTS BACKEND #####################################################
      ##########################################################################
      - name: backend | copy env files webapp
        run: cp webapp/.env.template webapp/.env
      - name: backend | copy env files backend
        run: cp backend/.env.template backend/.env
      - name: backend | docker-compose
        run: docker-compose -f docker-compose.yml -f docker-compose.test.yml up --detach --no-deps neo4j backend
      - name: backend | Initialize Database
        run: docker-compose exec -T backend yarn db:migrate init
      - name: backend | Unit test
        run: docker-compose exec -T backend yarn test
      ##########################################################################
      # COVERAGE CHECK BACKEND #################################################
      ##########################################################################
      - name: backend | Coverage check
        uses: webcraftmedia/coverage-check-action@master
        with:
          report_name: Coverage Backend
          type: lcov
          result_path: ./coverage/lcov.info
          min_coverage: 58
          token: ${{ github.token }}

  ##############################################################################
  # JOB: UNIT TEST WEBAPP ######################################################
  ##############################################################################
  unit_test_webapp:
    name: Unit tests - webapp
    runs-on: ubuntu-latest
    needs: [build_test_webapp]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # DOWNLOAD DOCKER IMAGES #################################################
      ##########################################################################
      - name: Download Docker Image (Webapp)
        uses: actions/download-artifact@v2
        with:
          name: docker-webapp-test
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/webapp.tar
      ##########################################################################
      # UNIT TESTS WEBAPP ######################################################
      ##########################################################################
      - name: backend | copy env files webapp
        run: cp webapp/.env.template webapp/.env
      - name: backend | copy env files backend
        run: cp backend/.env.template backend/.env
      - name: backend | docker-compose
        run: docker-compose -f docker-compose.yml -f docker-compose.test.yml up --detach --no-deps webapp
      - name: webapp | Unit tests
        run: docker-compose exec -T webapp yarn test
      ##########################################################################
      # COVERAGE REPORT FRONTEND ################################################
      ##########################################################################
      #- name: frontend | Coverage report
      # uses: romeovs/lcov-reporter-action@v0.2.21
      # with:
      # github-token: ${{ secrets.GITHUB_TOKEN }}
      # lcov-file: ./coverage/lcov.info
      ##########################################################################
      # COVERAGE CHECK WEBAPP ##################################################
      ##########################################################################
      - name: webapp | Coverage check
        uses: webcraftmedia/coverage-check-action@master
        with:
          report_name: Coverage Webapp
          type: lcov
          result_path: ./coverage/lcov.info
          min_coverage: 65
          token: ${{ github.token }}

  ##############################################################################
  # JOB: FULLSTACK TESTS #######################################################
  ##############################################################################
  fullstack_tests:
    name: Fullstack tests
    runs-on: ubuntu-latest
    needs: [build_test_webapp, build_test_backend, build_test_neo4j]
    env:
      jobs: 8
    strategy:
      matrix:
        # run copies of the current job in parallel
        job: [1, 2, 3, 4, 5, 6, 7, 8]
    steps:
      ##########################################################################
      # CHECKOUT CODE ##########################################################
      ##########################################################################
      - name: Checkout code
        uses: actions/checkout@v2
      ##########################################################################
      # DOWNLOAD DOCKER IMAGES #################################################
      ##########################################################################
      - name: Download Docker Image (Neo4J)
        uses: actions/download-artifact@v2
        with:
          name: docker-neo4j-image
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/neo4j.tar
      - name: Download Docker Image (Backend)
        uses: actions/download-artifact@v2
        with:
          name: docker-backend-test
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/backend.tar
      - name: Download Docker Image (Webapp)
        uses: actions/download-artifact@v2
        with:
          name: docker-webapp-test
          path: /tmp
      - name: Load Docker Image
        run: docker load < /tmp/webapp.tar
      ##########################################################################
      # FULLSTACK TESTS CYPRESS ################################################
      ##########################################################################
      - name: webapp | copy env files webapp
        run: cp webapp/.env.template webapp/.env
      - name: backend | copy env files backend
        run: cp backend/.env.template backend/.env
      - name: backend | docker-compose
        run: docker-compose -f docker-compose.yml -f docker-compose.test.yml up --detach --no-deps webapp neo4j backend
      - name: cypress | Fullstack tests
        run: |
          yarn install
          yarn run cypress:run --spec $(cypress/parallel-features.sh ${{ matrix.job }} ${{ env.jobs }} )
      ##########################################################################
      # UPLOAD SCREENSHOTS & VIDEO #############################################
      ##########################################################################
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: cypress-screenshots
          path: cypress/screenshots/
      - name: Upload Artifact
        uses: actions/upload-artifact@v2
        with:
          name: cypress-videos
          path: cypress/videos/
CHANGELOG.md: 18945 changed lines (file diff suppressed because it is too large)

CONTRIBUTING.md
@ -4,31 +4,33 @@ Thank you so much for thinking of contributing to the Human Connection project!
|
||||
|
||||
## Getting Set Up
|
||||
|
||||
Instructions for how to install all the necessary software and some code guidelines can be found in our [documentation](https://docs.human-connection.org/human-connection/).
|
||||
Instructions for how to install all the necessary software and some code guidelines can be found in our main [Readme](/README.md) or in our [documentation](https://docs.human-connection.org/human-connection/).
|
||||
|
||||
To get you started we recommend that you join forces with a regular contributor. Please join [our discord instance](https://human-connection.org/discord) to chat with developers or just get in touch directly on an issue on either [Github](https://github.com/Ocelot-Social-Community/Ocelot-Social/issues) or [Zenhub](https://app.zenhub.com/workspaces/ocelotsocial-5fb21ff922cb410015dd6535/board?filterLogic=any&repos=301151089):
|
||||
To get you started we recommend that you join forces with a regular contributor. Please join [our Discord instance](https://discord.gg/AJSX9DCSUA) to chat with developers or just get in touch directly on an issue on either [Github](https://github.com/Ocelot-Social-Community/Ocelot-Social/issues) or [Zenhub](https://app.zenhub.com/workspaces/ocelotsocial-5fb21ff922cb410015dd6535/board?filterLogic=any&repos=301151089):
|
||||
|
||||

|
||||
|
||||
We also have regular pair programming sessions that you are very welcome to join! We feel this is often the best way to get to know both the project and the team. Most developers are also available for spontaneous sessions if the times listed below don't work for you – just ping us on discord.
|
||||
We also have regular pair programming sessions that you are very welcome to join! We feel this is often the best way to get to know both the project and the team. Most developers are also available for spontaneous sessions if the times listed below don't work for you – just ping us on Discord.
|
||||
|
||||
## Development Flow
|
||||
|
||||
We operate in two week sprints that are planned, estimated and prioritised on [Zenhub](https://app.zenhub.com/workspaces/ocelotsocial-5fb21ff922cb410015dd6535/board?filterLogic=any&repos=301151089). All issues are also linked to and synced with [Github](https://github.com/Ocelot-Social-Community/Ocelot-Social/issues). Look for the `good first issue` label if you're not sure where to start!
|
||||
|
||||
We try to discuss all questions directly related to a feature or bug in the respective issue, in order to preserve it for the future and for other developers. We use discord for real-time communication.
|
||||
We try to discuss all questions directly related to a feature or bug in the respective issue, in order to preserve it for the future and for other developers. We use Discord for real-time communication.
|
||||
|
||||
This is how we solve bugs and implement features, step by step:
|
||||
|
||||
1. We find an issue we want to work on, usually during the sprint planning but as an open source contributor this can happen at any time.
|
||||
2. We communicate with the team to see if the issue is still available. (When you comment on an issue but don't get an answer there within 1-2 days try to mention @Human-Connection/hc-dev-team to make sure we check in.)
|
||||
3. We make sure we understand the issue in detail – what problem is it solving and how should it be implemented?
|
||||
4. We assign ourselves to the issue and move it to `In Progress` on [Zenhub](https://app.zenhub.com/workspaces/human-connection-nitro-5c0154ecc699f60fc92cf11f).
|
||||
4. We assign ourselves to the issue and move it to `In Progress` on [Zenhub](https://app.zenhub.com/workspaces/ocelotsocial-5fb21ff922cb410015dd6535/board?filterLogic=any&repos=301151089).
|
||||
5. We start working on it in a `new branch` and open a `pull request` prefixed with `[WIP]` (work in progress) to which we regularly push our changes.
|
||||
6. When questions come up we clarify them with the team (directly in the issue on Github).
|
||||
7. When we are happy with our work and our PR is passing all tests we remove the `[WIP]` from the PR description and ask for reviews (if you're not sure who to ask there is @Human-Connection/hc-dev-team which pings all core developers).
|
||||
8. We then incorporate the suggestions from the reviews into our work and once it has been approved it can be merged into master!
|
||||
|
||||
Every pull request needs to:
|
||||
|
||||
* fix an issue (if there is something you want to work on but there is no issue for it, create one first and discuss it with the team)
|
||||
* include tests for the code that is added or changed
|
||||
* pass all tests (linter, backend, frontend, end-to-end)
|
||||
@ -38,37 +40,46 @@ Every pull request needs to:
|
||||
|
||||
There are many volunteers all around the world helping us build this network and without their contributions we wouldn't be where we are today. Big thank you to all of you!
|
||||
|
||||
You can see the core team behind Human Connection [on our website](https://human-connection.org/en/the-team/). On Github you will mostly run into our developers:
|
||||
* Robert (@roschaefer)
|
||||
* Matt (@mattwr18)
|
||||
You can talk to our core team on [Discord](https://discord.gg/AJSX9DCSUA). And on Github you will mostly run into our core developers:
|
||||
|
||||
* Ulf (@ulfgebhardt)
|
||||
* Moriz (@Mogge)
|
||||
* Wolle (@Tirokk)
|
||||
* Alex (@ogerly)
|
||||
|
||||
<!-- * Robert (@roschaefer)
|
||||
* Matt (@mattwr18)
|
||||
* Alina (@alina-beck)
|
||||
* Martin (@datenbrei), our head of IT
|
||||
* and sometimes Dennis (@DennisHack), the founder of Human Connection
|
||||
* and sometimes Dennis (@DennisHack), the founder of Human Connection -->
|
||||
|
||||
## Meetings and Pair Programming Sessions
|
||||
|
||||
Times below refer to **German Time** – that's CET (GMT+1) in winter and CEST (GMT+2) in summer – because most Human Connection core team members are living in Germany.
|
||||
|
||||
Daily standup
|
||||
* every Monday–Friday 11:30
|
||||
* in the discord `Conference Room`
|
||||
|
||||
* every Monday–Thursday 11:30 am (german time see above 👆🏼)
|
||||
* in our [Discord](https://discord.gg/AJSX9DCSUA) `Office Cube`
|
||||
* all contributors welcome!
|
||||
* everybody shares what they are working on and asks for help if they are blocked
|
||||
|
||||
<!--
|
||||
Regular pair programming sessions
|
||||
|
||||
* every Monday, Wednesday and Thursday 15:00
|
||||
* the link will be posted in the [discord chat](https://discord.gg/6ub73U3) and on the [Agile Ventures website](https://www.agileventures.org/events?utf8=%E2%9C%93&project_id=220&commit=Filter+by+Project)
|
||||
* the link will be posted in the [Discord chat](https://discord.gg/AJSX9DCSUA) and on the [Agile Ventures website](https://www.agileventures.org/events?utf8=%E2%9C%93&project_id=220&commit=Filter+by+Project)
|
||||
* all contributors welcome!
|
||||
* we team up and work on an issue together (often using Visual Studio live sharing sessions)
|
||||
|
||||
Open-Source Community Meeting
|
||||
|
||||
* bi-weekly on Mondays 13:00 (when there is no sprint retrospective)
|
||||
* the link will be posted in the [discord chat](https://discord.gg/6ub73U3) and on the [Agile Ventures website](https://www.agileventures.org/events?utf8=%E2%9C%93&project_id=220&commit=Filter+by+Project)
|
||||
* the link will be posted in the [Discord chat](https://discord.gg/AJSX9DCSUA) and on the [Agile Ventures website](https://www.agileventures.org/events?utf8=%E2%9C%93&project_id=220&commit=Filter+by+Project)
|
||||
* all contributors welcome!
|
||||
|
||||
Meet the team
|
||||
|
||||
* every Monday 21:00 (at the moment only in German)
|
||||
* details here https://human-connection.org/veranstaltungen/
|
||||
* via this [zoom link](https://zoom.us/j/936943532)
|
||||
@ -76,6 +87,7 @@ Meet the team
|
||||
* users of the network chat with the Human Connection team and discuss current questions and issues
|
||||
|
||||
Sprint planning
|
||||
|
||||
* bi-weekly on Tuesday 13:00
|
||||
* via this [zoom link](https://zoom.us/j/7743582385)
|
||||
* all contributors welcome (recommended for those who want to work on an issue in this sprint)
|
||||
@ -87,6 +99,7 @@ Sprint retrospective
|
||||
* via this [zoom link](https://zoom.us/j/7743582385)
|
||||
* all contributors welcome (most interesting for those who participated in the sprint)
|
||||
* we review the past sprint and talk about what went well and what we could improve
|
||||
-->
|
||||
|
||||
## Philosophy
|
||||
|
||||
@ -102,10 +115,9 @@ We use pair programming sessions as a tool for knowledge sharing. We can learn a
|
||||
|
||||
As a volunteer you have no commitment except your own self-development and your awesomeness in contributing to this free and open-source software project. Cheers to you!
|
||||
|
||||
|
||||
## Open-Source Bounties
|
||||
|
||||
There are so many good reasons to contribute to Human Connection
|
||||
There are so many good reasons to contribute to ocelot.social
|
||||
|
||||
* You learn state-of-the-art technologies
|
||||
* You build your portfolio
|
||||
@ -121,7 +133,7 @@ pull request approved and merged for free**. You can choose something really
|
||||
quick and easy. What's important is starting a working relationship with the
|
||||
team, learning the workflow, and understanding this contribution guide. You can
|
||||
filter issues by 'good first issue', to get an idea where to start. Please join
|
||||
our [community chat](https://human-connection.org/discord), too.
|
||||
our [Discord community chat](https://discord.gg/AJSX9DCSUA), too.
|
||||
|
||||
You can filter Github issues with the label [bounty](https://github.com/Ocelot-Social-Community/Ocelot-Social/issues?q=is%3Aopen+is%3Aissue+label%3Abounty). These issues should have a second label `€<amount>`
which indicates their respective financial compensation in Euros.
|
||||
|
||||
118
README.md
118
README.md
@ -1,4 +1,4 @@
|
||||
# Human-Connection
|
||||
# ocelot.social
|
||||
|
||||
[](https://travis-ci.com/Human-Connection/Human-Connection)
|
||||
[](https://codecov.io/gh/Human-Connection/Human-Connection/)
|
||||
@ -6,26 +6,19 @@
|
||||
[](https://discordapp.com/invite/DFSjPaX)
|
||||
[](https://www.codetriage.com/human-connection/human-connection)
|
||||
|
||||
Human Connection is a nonprofit social, action and knowledge network that connects information to action and promotes positive local and global change in all areas of life.
|
||||
ocelot.social is a nonprofit social, action and knowledge network that connects information to action and promotes positive local and global change in all areas of life.
|
||||
|
||||
* **Social**: Interact with other people not just by commenting on their posts, but by providing **Pro & Contra** arguments, giving a **Versus**, or reaching out via the integrated **Chat** or **Let's Talk**
|
||||
* **Knowledge**: Read articles about interesting topics and find related posts in the **More Info** tab, by **Filtering** based on **Categories** and **Tagging**, or by using the **Fulltext Search**.
|
||||
* **Action**: Don't just read about how to make the world a better place, but take **Action** by following the suggestions on the **Action** tab provided by other people or **Organisations**.
|
||||
|
||||
[](https://human-connection.org)
|
||||
|
||||
**Technology Stack**
|
||||
|
||||
* [VueJS](https://vuejs.org/)
|
||||
* [NuxtJS](https://nuxtjs.org/)
|
||||
* [GraphQL](https://graphql.org/)
|
||||
* [NodeJS](https://nodejs.org/en/)
|
||||
* [Neo4J](https://neo4j.com/)
|
||||
|
||||
<p align="center">
|
||||
<img src="webapp/static/img/custom/logo-squared.svg" alt="ocelot.social" width="40%" height="40%">
|
||||
</p>
|
||||
|
||||
## Live demo
|
||||
|
||||
Try out our deployed [development environment](https://develop.human-connection.org/).
|
||||
__Try out our deployed [development environment](https://develop.human-connection.org/).__
|
||||
|
||||
Logins:
|
||||
|
||||
@ -35,27 +28,105 @@ Logins:
|
||||
| `moderator@example.org` | 1234 | moderator |
|
||||
| `admin@example.org` | 1234 | admin |
|
||||
|
||||
## Documentation
|
||||
## Directory Layout
|
||||
|
||||
Learn how to set up a local development environment in our [Docs](https://docs.human-connection.org/human-connection/) :mag_right:
|
||||
There are four important directories:
|
||||
|
||||
## Translations
|
||||
* [Backend](./backend) runs on the server and is a middleware between database and frontend
|
||||
* [Frontend](./webapp) is a server-side-rendered and client-side-rendered web frontend
|
||||
* [Cypress](./cypress) contains end-to-end tests and executable feature specifications
|
||||
|
||||
You can help translating the interface by joining us on [lokalise.co](https://lokalise.co/public/556252725c18dd752dd546.13222042/).
|
||||
Thank you lokalise for providing us with a premium account :raised_hands:.
|
||||
In order to set up the application and start developing features you have to
set up the **frontend** and **backend**.
|
||||
|
||||
There are two approaches:
|
||||
|
||||
1. Local installation, which means you have to take care of dependencies yourself
|
||||
2. **Or** install everything through Docker, which takes care of the dependencies for you
|
||||
|
||||
## Installation
|
||||
|
||||
### Clone the Repository
|
||||
|
||||
Clone the repository; this will create a new folder called `Ocelot-Social`:
|
||||
|
||||
Using HTTPS:
|
||||
|
||||
```bash
|
||||
$ git clone https://github.com/Ocelot-Social-Community/Ocelot-Social.git
|
||||
```
|
||||
|
||||
Using SSH:
|
||||
|
||||
```bash
|
||||
$ git clone git@github.com:Ocelot-Social-Community/Ocelot-Social.git
|
||||
```
|
||||
|
||||
Change into the new folder.
|
||||
|
||||
```bash
|
||||
$ cd Ocelot-Social
|
||||
```
|
||||
|
||||
### Docker Installation
|
||||
|
||||
Docker is a container tool that packages software and its dependencies into one standardized unit that contains everything needed to run it. This helps us avoid dependency problems and makes installation easier.
|
||||
|
||||
#### General Installation of Docker
|
||||
|
||||
There are [several ways to install Docker CE](https://docs.docker.com/install/) on your computer or server.
|
||||
|
||||
* [install Docker Desktop on macOS](https://docs.docker.com/docker-for-mac/install/)
|
||||
* [install Docker Desktop on Windows](https://docs.docker.com/docker-for-windows/install/)
|
||||
* [install Docker CE on Linux](https://docs.docker.com/install/)
|
||||
|
||||
Verify your Docker installation by checking the versions before proceeding. For example, we use the following versions:
|
||||
|
||||
```bash
|
||||
$ docker --version
|
||||
Docker version 18.09.2
|
||||
$ docker-compose --version
|
||||
docker-compose version 1.23.2
|
||||
```
|
||||
|
||||
#### Start Ocelot-Social via Docker-Compose
|
||||
|
||||
For Development:
|
||||
|
||||
```bash
|
||||
$ docker-compose up
|
||||
```
|
||||
|
||||
For Production:
|
||||
|
||||
```bash
|
||||
$ docker-compose -f docker-compose.yml up
|
||||
```
|
||||
|
||||
This will start all required Docker containers.
|
||||
|
||||
## Deployment
|
||||
|
||||
Deployment methods can be found in the [Ocelot-Social-Deploy-Rebranding](https://github.com/Ocelot-Social-Community/Ocelot-Social-Deploy-Rebranding) repository.
|
||||
|
||||
The only deployment method in this repository is `docker-compose` for development purposes as described above.
|
||||
|
||||
## Developer Chat
|
||||
|
||||
Join our friendly open-source community on [Discord](https://discordapp.com/invite/DFSjPaX) :heart_eyes_cat:
|
||||
Join our friendly open-source community on [Discord](https://discord.gg/AJSX9DCSUA) :heart_eyes_cat:
|
||||
Just introduce yourself in `#introduce-yourself` and mention `@@Mentor` to get onboarded :neckbeard:
|
||||
Check out the [contribution guideline](./CONTRIBUTING.md), too!
|
||||
|
||||
[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/0)[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/1)[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/2)[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/3)[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/4)[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/5)[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/6)[](https://sourcerer.io/fame/roschaefer/Human-Connection/Human-Connection/links/7)
|
||||
We give write permissions to every developer who asks for it. Just text us on
|
||||
[Discord](https://discord.gg/AJSX9DCSUA).
|
||||
|
||||
## Open-Source Bounties
|
||||
## Technology Stack
|
||||
|
||||
You can get a small financial compensation for your contribution :moneybag: See
|
||||
details in our [Contribution Guidelines](./CONTRIBUTING.md#open-source-bounties).
|
||||
* [VueJS](https://vuejs.org/)
|
||||
* [NuxtJS](https://nuxtjs.org/)
|
||||
* [GraphQL](https://graphql.org/)
|
||||
* [NodeJS](https://nodejs.org/en/)
|
||||
* [Neo4J](https://neo4j.com/)
|
||||
|
||||
## Attributions
|
||||
|
||||
@ -66,4 +137,5 @@ Browser compatibility testing with [BrowserStack](https://www.browserstack.com/)
|
||||
<img alt="BrowserStack Logo" src=".gitbook/assets/browserstack-logo.svg" width="256">
|
||||
|
||||
## License
|
||||
|
||||
See the [LICENSE](LICENSE.md) file for license rights and limitations (MIT).
|
||||
|
||||
19
SUMMARY.md
19
SUMMARY.md
@ -2,7 +2,6 @@
|
||||
|
||||
* [Introduction](README.md)
|
||||
* [Edit this Documentation](edit-this-documentation.md)
|
||||
* [Installation](installation.md)
|
||||
* [Neo4J](neo4j/README.md)
|
||||
* [Backend](backend/README.md)
|
||||
* [GraphQL](backend/graphql.md)
|
||||
@ -16,24 +15,8 @@
|
||||
* [End-to-end tests](cypress/README.md)
|
||||
* [Frontend tests](webapp/testing.md)
|
||||
* [Backend tests](backend/testing.md)
|
||||
* [Deployment](https://github.com/Ocelot-Social-Community/Ocelot-Social-Deploy-Rebranding/blob/master/deployment/README.md)
|
||||
* [Contributing](CONTRIBUTING.md)
|
||||
* [Kubernetes Deployment](deployment/README.md)
|
||||
* [Minikube](deployment/minikube/README.md)
|
||||
* [Digital Ocean](deployment/digital-ocean/README.md)
|
||||
* [Kubernetes Dashboard](deployment/digital-ocean/dashboard/README.md)
|
||||
* [HTTPS](deployment/digital-ocean/https/README.md)
|
||||
* [ocelot.social](deployment/ocelot-social/README.md)
|
||||
* [Error Reporting](deployment/ocelot-social/error-reporting/README.md)
|
||||
* [Mailserver](deployment/ocelot-social/mailserver/README.md)
|
||||
* [Maintenance](deployment/ocelot-social/maintenance/README.md)
|
||||
* [Volumes](deployment/volumes/README.md)
|
||||
* [Neo4J Offline-Backups](deployment/volumes/neo4j-offline-backup/README.md)
|
||||
* [Neo4J Online-Backups](deployment/volumes/neo4j-online-backup/README.md)
|
||||
* [Volume Snapshots](deployment/volumes/volume-snapshots/README.md)
|
||||
* [Reclaim Policy](deployment/volumes/reclaim-policy/README.md)
|
||||
* [Velero](deployment/volumes/velero/README.md)
|
||||
* [Metrics](deployment/monitoring/README.md)
|
||||
* [Legacy Migration](deployment/legacy-migration/README.md)
|
||||
* [Feature Specification](cypress/features.md)
|
||||
* [Code of conduct](CODE_OF_CONDUCT.md)
|
||||
* [License](LICENSE.md)
|
||||
|
||||
@ -10,6 +10,7 @@ SMTP_USERNAME=
|
||||
SMTP_PASSWORD=
|
||||
|
||||
JWT_SECRET="b/&&7b78BF&fv/Vd"
|
||||
JWT_EXPIRES="2y"
|
||||
MAPBOX_TOKEN="pk.eyJ1IjoiYnVzZmFrdG9yIiwiYSI6ImNraDNiM3JxcDBhaWQydG1uczhpZWtpOW4ifQ.7TNRTO-o9aK1Y6MyW_Nd4g"
|
||||
|
||||
PRIVATE_KEY_PASSPHRASE="a7dsf78sadg87ad87sfagsadg78"
|
||||
@ -17,6 +18,7 @@ PRIVATE_KEY_PASSPHRASE="a7dsf78sadg87ad87sfagsadg78"
|
||||
SENTRY_DSN_BACKEND=
|
||||
COMMIT=
|
||||
PUBLIC_REGISTRATION=false
|
||||
INVITE_REGISTRATION=true
|
||||
|
||||
AWS_ACCESS_KEY_ID=
|
||||
AWS_SECRET_ACCESS_KEY=
|
||||
|
||||
@ -1,28 +1,102 @@
|
||||
##################################################################################
|
||||
# BASE (Is pushed to DockerHub for rebranding) ###################################
|
||||
##################################################################################
|
||||
FROM node:12.19.0-alpine3.10 as base
|
||||
LABEL Description="Backend of the Social Network ocelot.social" Vendor="ocelot.social Community" Version="0.0.1" Maintainer="ocelot.social Community (devops@ocelot.social)"
|
||||
|
||||
EXPOSE 4000
|
||||
CMD ["yarn", "run", "start"]
|
||||
ARG BUILD_COMMIT
|
||||
ENV BUILD_COMMIT=$BUILD_COMMIT
|
||||
ARG WORKDIR=/develop-backend
|
||||
RUN mkdir -p $WORKDIR
|
||||
WORKDIR $WORKDIR
|
||||
# ENVs
|
||||
## DOCKER_WORKDIR would be a classical ARG, but that is not multi layer persistent - shame
|
||||
ENV DOCKER_WORKDIR="/app"
|
||||
## We Cannot do `$(date -u +'%Y-%m-%dT%H:%M:%SZ')` here so we use unix timestamp=0
|
||||
ARG BBUILD_DATE="1970-01-01T00:00:00.00Z"
|
||||
ENV BUILD_DATE=$BBUILD_DATE
|
||||
## We cannot do $(yarn run version)-${BUILD_NUMBER} here so we default to 0.0.0-0
|
||||
ARG BBUILD_VERSION="0.0.0-0"
|
||||
ENV BUILD_VERSION=$BBUILD_VERSION
|
||||
## We cannot do `$(git rev-parse --short HEAD)` here so we default to 0000000
|
||||
ARG BBUILD_COMMIT="0000000"
|
||||
ENV BUILD_COMMIT=$BBUILD_COMMIT
|
||||
## SET NODE_ENV
|
||||
ENV NODE_ENV="production"
|
||||
## App relevant Envs
|
||||
ENV PORT="4000"
|
||||
|
||||
# Labels
|
||||
LABEL org.label-schema.build-date="${BUILD_DATE}"
|
||||
LABEL org.label-schema.name="ocelot.social:backend"
|
||||
LABEL org.label-schema.description="Backend of the Social Network Software ocelot.social"
|
||||
LABEL org.label-schema.usage="https://github.com/Ocelot-Social-Community/Ocelot-Social/blob/master/README.md"
|
||||
LABEL org.label-schema.url="https://ocelot.social"
|
||||
LABEL org.label-schema.vcs-url="https://github.com/Ocelot-Social-Community/Ocelot-Social/tree/master/backend"
|
||||
LABEL org.label-schema.vcs-ref="${BUILD_COMMIT}"
|
||||
LABEL org.label-schema.vendor="ocelot.social Community"
|
||||
LABEL org.label-schema.version="${BUILD_VERSION}"
|
||||
LABEL org.label-schema.schema-version="1.0"
|
||||
LABEL maintainer="devops@ocelot.social"
|
||||
|
||||
# Install Additional Software
|
||||
## install: git
|
||||
RUN apk --no-cache add git
|
||||
|
||||
COPY package.json yarn.lock ./
|
||||
COPY .env.template .env
|
||||
# Settings
|
||||
## Expose Container Port
|
||||
EXPOSE ${PORT}
|
||||
|
||||
FROM base as build-and-test
|
||||
RUN yarn install --production=false --frozen-lockfile --non-interactive
|
||||
## Workdir
|
||||
RUN mkdir -p ${DOCKER_WORKDIR}
|
||||
WORKDIR ${DOCKER_WORKDIR}
|
||||
|
||||
##################################################################################
|
||||
# DEVELOPMENT (Connected to the local environment, to reload on demand) ##########
|
||||
##################################################################################
|
||||
FROM base as development
|
||||
|
||||
# We don't need to copy or build anything since we gonna bind to the
|
||||
# local filesystem which will need a rebuild anyway
|
||||
|
||||
# Run command
|
||||
# (for development we need to execute yarn install since the
|
||||
# node_modules are on another volume and need updating)
|
||||
CMD /bin/sh -c "yarn install && yarn run dev"
|
||||
|
||||
##################################################################################
|
||||
# CODE (Does contain all code files and is pushed to DockerHub for rebranding) ###
|
||||
##################################################################################
|
||||
FROM base as code
|
||||
|
||||
# copy everything, but do not build.
|
||||
COPY . .
|
||||
RUN NODE_ENV=production yarn run build
|
||||
|
||||
# reduce image size with a multistage build
|
||||
##################################################################################
|
||||
# BUILD (Does contain all files and the compilate and is therefore bloated) ######
|
||||
##################################################################################
|
||||
FROM code as build
|
||||
|
||||
# yarn install
|
||||
RUN yarn install --production=false --frozen-lockfile --non-interactive
|
||||
# yarn build
|
||||
RUN yarn run build
|
||||
|
||||
##################################################################################
|
||||
# TEST ###########################################################################
|
||||
##################################################################################
|
||||
FROM build as test
|
||||
|
||||
# Run command
|
||||
CMD /bin/sh -c "yarn run dev"
|
||||
|
||||
##################################################################################
|
||||
# PRODUCTION (Does contain only "binary"- and static-files to reduce image size) #
|
||||
##################################################################################
|
||||
FROM base as production
|
||||
ENV NODE_ENV=production
|
||||
COPY --from=build-and-test /develop-backend/dist ./dist
|
||||
COPY ./public/img/ ./public/img/
|
||||
COPY ./public/providers.json ./public/providers.json
|
||||
RUN yarn install --production=true --frozen-lockfile --non-interactive --no-cache
|
||||
|
||||
# Copy "binary"-files from build image
|
||||
COPY --from=build ${DOCKER_WORKDIR}/dist ./dist
|
||||
# Copy static files
|
||||
# TODO - externalize the uploads so we can copy the whole folder
|
||||
COPY --from=build ${DOCKER_WORKDIR}/public/img/ ./public/img/
|
||||
COPY --from=build ${DOCKER_WORKDIR}/public/providers.json ./public/providers.json
|
||||
# Copy package.json for script definitions (lock file should not be needed)
|
||||
COPY --from=build ${DOCKER_WORKDIR}/package.json ./package.json
|
||||
|
||||
# Run command
|
||||
CMD /bin/sh -c "yarn run start"
|
||||
|
||||
@ -178,32 +178,20 @@ database after each test, running the tests will wipe out all your data!
|
||||
{% tabs %}
|
||||
{% tab title="Docker" %}
|
||||
|
||||
Run the _**jest**_ tests:
|
||||
Run the unit tests:
|
||||
|
||||
```bash
|
||||
$ docker-compose exec backend yarn run test:jest
|
||||
```
|
||||
|
||||
Run the _**cucumber**_ features:
|
||||
|
||||
```bash
|
||||
$ docker-compose exec backend yarn run test:cucumber
|
||||
$ docker-compose exec backend yarn run test
|
||||
```
|
||||
|
||||
{% endtab %}
|
||||
|
||||
{% tab title="Without Docker" %}
|
||||
|
||||
Run the _**jest**_ tests:
|
||||
Run the unit tests:
|
||||
|
||||
```bash
|
||||
$ yarn run test:jest
|
||||
```
|
||||
|
||||
Run the _**cucumber**_ features:
|
||||
|
||||
```bash
|
||||
$ yarn run test:cucumber
|
||||
$ yarn run test
|
||||
```
|
||||
|
||||
{% endtab %}
|
||||
|
||||
@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "ocelot-social-backend",
|
||||
"version": "0.6.3",
|
||||
"version": "1.0.4",
|
||||
"description": "GraphQL Backend for ocelot.social",
|
||||
"repository": "https://github.com/Ocelot-Social-Community/Ocelot-Social",
|
||||
"author": "ocelot.social Community",
|
||||
@ -15,7 +15,7 @@
|
||||
"dev": "nodemon --exec babel-node src/ -e js,gql",
|
||||
"dev:debug": "nodemon --exec babel-node --inspect=0.0.0.0:9229 src/ -e js,gql",
|
||||
"lint": "eslint src --config .eslintrc.js",
|
||||
"test": "jest --forceExit --detectOpenHandles --runInBand",
|
||||
"test": "cross-env NODE_ENV=test jest --forceExit --detectOpenHandles --runInBand --coverage",
|
||||
"db:clean": "babel-node src/db/clean.js",
|
||||
"db:reset": "yarn run db:clean",
|
||||
"db:seed": "babel-node src/db/seed.js",
|
||||
@ -39,6 +39,12 @@
|
||||
]
|
||||
},
|
||||
"dependencies": {
|
||||
"@babel/cli": "~7.8.4",
|
||||
"@babel/core": "~7.9.0",
|
||||
"@babel/node": "~7.8.7",
|
||||
"@babel/plugin-proposal-throw-expressions": "^7.8.3",
|
||||
"@babel/preset-env": "~7.9.5",
|
||||
"@babel/register": "^7.9.0",
|
||||
"@hapi/joi": "^17.1.1",
|
||||
"@sentry/node": "^5.15.4",
|
||||
"apollo-cache-inmemory": "~1.6.5",
|
||||
@ -48,12 +54,15 @@
|
||||
"apollo-server": "~2.14.2",
|
||||
"apollo-server-express": "^2.14.2",
|
||||
"aws-sdk": "^2.652.0",
|
||||
"babel-core": "~7.0.0-0",
|
||||
"babel-eslint": "~10.1.0",
|
||||
"babel-jest": "~25.2.6",
|
||||
"babel-plugin-transform-runtime": "^6.23.0",
|
||||
"bcryptjs": "~2.4.3",
|
||||
"cheerio": "~1.0.0-rc.3",
|
||||
"cors": "~2.8.5",
|
||||
"cross-env": "~7.0.2",
|
||||
"date-fns": "2.11.1",
|
||||
"date-fns": "2.22.1",
|
||||
"debug": "~4.1.1",
|
||||
"dotenv": "~8.2.0",
|
||||
"express": "^4.17.1",
|
||||
@ -69,9 +78,10 @@
|
||||
"helmet": "~3.22.0",
|
||||
"ioredis": "^4.16.1",
|
||||
"jsonwebtoken": "~8.5.1",
|
||||
"languagedetect": "^2.0.0",
|
||||
"linkifyjs": "~2.1.8",
|
||||
"lodash": "~4.17.14",
|
||||
"merge-graphql-schemas": "^1.7.7",
|
||||
"merge-graphql-schemas": "^1.7.8",
|
||||
"metascraper": "^5.11.8",
|
||||
"metascraper-audio": "^5.14.26",
|
||||
"metascraper-author": "^5.14.22",
|
||||
@ -91,7 +101,7 @@
|
||||
"migrate": "^1.7.0",
|
||||
"mime-types": "^2.1.26",
|
||||
"minimatch": "^3.0.4",
|
||||
"mustache": "^4.0.1",
|
||||
"mustache": "^4.2.0",
|
||||
"neo4j-driver": "^4.0.2",
|
||||
"neo4j-graphql-js": "^2.11.5",
|
||||
"neode": "^0.3.7",
|
||||
@ -110,16 +120,7 @@
|
||||
"xregexp": "^4.3.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@babel/cli": "~7.8.4",
|
||||
"@babel/core": "~7.9.0",
|
||||
"@babel/node": "~7.8.7",
|
||||
"@babel/plugin-proposal-throw-expressions": "^7.8.3",
|
||||
"@babel/preset-env": "~7.9.5",
|
||||
"@babel/register": "^7.9.0",
|
||||
"apollo-server-testing": "~2.11.0",
|
||||
"babel-core": "~7.0.0-0",
|
||||
"babel-eslint": "~10.1.0",
|
||||
"babel-jest": "~25.2.6",
|
||||
"chai": "~4.2.0",
|
||||
"cucumber": "~6.0.5",
|
||||
"eslint": "~6.8.0",
|
||||
@ -133,7 +134,7 @@
|
||||
"eslint-plugin-standard": "~4.0.1",
|
||||
"jest": "~25.3.0",
|
||||
"nodemon": "~2.0.2",
|
||||
"prettier": "~2.2.0",
|
||||
"prettier": "~2.3.2",
|
||||
"rosie": "^2.0.1",
|
||||
"supertest": "~4.0.2"
|
||||
},
|
||||
|
||||
@ -1,6 +1,7 @@
|
||||
import { handler } from './webfinger'
|
||||
import Factory, { cleanDatabase } from '../../db/factories'
|
||||
import { getDriver } from '../../db/neo4j'
|
||||
import CONFIG from '../../config'
|
||||
|
||||
let resource, res, json, status, contentType
|
||||
|
||||
@ -98,12 +99,12 @@ describe('webfinger', () => {
|
||||
expect(json).toHaveBeenCalledWith({
|
||||
links: [
|
||||
{
|
||||
href: 'http://localhost:3000/activitypub/users/some-user',
|
||||
href: `${CONFIG.CLIENT_URI}/activitypub/users/some-user`,
|
||||
rel: 'self',
|
||||
type: 'application/activity+json',
|
||||
},
|
||||
],
|
||||
subject: 'acct:some-user@localhost:3000',
|
||||
subject: `acct:some-user@${new URL(CONFIG.CLIENT_URI).host}`,
|
||||
})
|
||||
})
|
||||
})
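
The updated expectation derives the webfinger host from `CONFIG.CLIENT_URI` instead of hard-coding `localhost:3000`. A tiny illustration of that derivation, using the documented default value as an assumption:

```js
// Illustration only; the CLIENT_URI value is the documented default, not a project constant.
const CLIENT_URI = 'http://localhost:3000'

// `host` keeps the port, which is exactly what the `acct:` subject needs
const subject = `acct:some-user@${new URL(CLIENT_URI).host}`
console.log(subject) // acct:some-user@localhost:3000
```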
|
||||
|
||||
@ -2,111 +2,108 @@ import dotenv from 'dotenv'
|
||||
import links from './links.js'
|
||||
import metadata from './metadata.js'
|
||||
|
||||
// Load env file
|
||||
if (require.resolve) {
|
||||
// are we in a nodejs environment?
|
||||
try {
|
||||
dotenv.config({ path: require.resolve('../../.env') })
|
||||
} catch (error) {
|
||||
if (error.code !== 'MODULE_NOT_FOUND') throw error
|
||||
console.log('WARN: No `.env` file found in /backend') // eslint-disable-line no-console
|
||||
// This error is thrown when the .env is not found
|
||||
if (error.code !== 'MODULE_NOT_FOUND') {
|
||||
throw error
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// eslint-disable-next-line no-undef
|
||||
const env = typeof Cypress !== 'undefined' ? Cypress.env() : process.env
|
||||
// Use Cypress env or process.env
|
||||
const env = typeof Cypress !== 'undefined' ? Cypress.env() : process.env // eslint-disable-line no-undef
|
||||
|
||||
const {
|
||||
MAPBOX_TOKEN,
|
||||
JWT_SECRET,
|
||||
PRIVATE_KEY_PASSPHRASE,
|
||||
SMTP_IGNORE_TLS = true,
|
||||
SMTP_HOST,
|
||||
SMTP_PORT,
|
||||
SMTP_USERNAME,
|
||||
SMTP_PASSWORD,
|
||||
SENTRY_DSN_BACKEND,
|
||||
COMMIT,
|
||||
AWS_ACCESS_KEY_ID,
|
||||
AWS_SECRET_ACCESS_KEY,
|
||||
AWS_ENDPOINT,
|
||||
AWS_REGION,
|
||||
AWS_BUCKET,
|
||||
NEO4J_URI = 'bolt://localhost:7687',
|
||||
NEO4J_USERNAME = 'neo4j',
|
||||
NEO4J_PASSWORD = 'neo4j',
|
||||
CLIENT_URI = 'http://localhost:3000',
|
||||
GRAPHQL_URI = 'http://localhost:4000',
|
||||
REDIS_DOMAIN,
|
||||
REDIS_PORT,
|
||||
REDIS_PASSWORD,
|
||||
EMAIL_DEFAULT_SENDER,
|
||||
} = env
|
||||
|
||||
export const requiredConfigs = {
|
||||
MAPBOX_TOKEN,
|
||||
JWT_SECRET,
|
||||
PRIVATE_KEY_PASSPHRASE,
|
||||
const environment = {
|
||||
NODE_ENV: env.NODE_ENV || process.NODE_ENV,
|
||||
DEBUG: env.NODE_ENV !== 'production' && env.DEBUG,
|
||||
TEST: env.NODE_ENV === 'test',
|
||||
PRODUCTION: env.NODE_ENV === 'production',
|
||||
DISABLED_MIDDLEWARES: (env.NODE_ENV !== 'production' && env.DISABLED_MIDDLEWARES) || false,
|
||||
}
|
||||
|
||||
const required = {
|
||||
MAPBOX_TOKEN: env.MAPBOX_TOKEN,
|
||||
JWT_SECRET: env.JWT_SECRET,
|
||||
PRIVATE_KEY_PASSPHRASE: env.PRIVATE_KEY_PASSPHRASE,
|
||||
}
|
||||
|
||||
const server = {
|
||||
CLIENT_URI: env.CLIENT_URI || 'http://localhost:3000',
|
||||
GRAPHQL_URI: env.GRAPHQL_URI || 'http://localhost:4000',
|
||||
JWT_EXPIRES: env.JWT_EXPIRES || '2y',
|
||||
}
|
||||
|
||||
const smtp = {
|
||||
SMTP_HOST: env.SMTP_HOST,
|
||||
SMTP_PORT: env.SMTP_PORT,
|
||||
SMTP_IGNORE_TLS: env.SMTP_IGNORE_TLS !== 'false', // default = true
|
||||
SMTP_SECURE: env.SMTP_SECURE === 'true',
|
||||
SMTP_USERNAME: env.SMTP_USERNAME,
|
||||
SMTP_PASSWORD: env.SMTP_PASSWORD,
|
||||
}
|
||||
|
||||
const neo4j = {
|
||||
NEO4J_URI: env.NEO4J_URI || 'bolt://localhost:7687',
|
||||
NEO4J_USERNAME: env.NEO4J_USERNAME || 'neo4j',
|
||||
NEO4J_PASSWORD: env.NEO4J_PASSWORD || 'neo4j',
|
||||
}
|
||||
|
||||
const sentry = {
|
||||
SENTRY_DSN_BACKEND: env.SENTRY_DSN_BACKEND,
|
||||
COMMIT: env.COMMIT,
|
||||
}
|
||||
|
||||
const redis = {
|
||||
REDIS_DOMAIN: env.REDIS_DOMAIN,
|
||||
REDIS_PORT: env.REDIS_PORT,
|
||||
REDIS_PASSWORD: env.REDIS_PASSWORD,
|
||||
}
|
||||
|
||||
const s3 = {
|
||||
AWS_ACCESS_KEY_ID: env.AWS_ACCESS_KEY_ID,
|
||||
AWS_SECRET_ACCESS_KEY: env.AWS_SECRET_ACCESS_KEY,
|
||||
AWS_ENDPOINT: env.AWS_ENDPOINT,
|
||||
AWS_REGION: env.AWS_REGION,
|
||||
AWS_BUCKET: env.AWS_BUCKET,
|
||||
S3_CONFIGURED:
|
||||
env.AWS_ACCESS_KEY_ID &&
|
||||
env.AWS_SECRET_ACCESS_KEY &&
|
||||
env.AWS_ENDPOINT &&
|
||||
env.AWS_REGION &&
|
||||
env.AWS_BUCKET,
|
||||
}
|
||||
|
||||
const options = {
|
||||
EMAIL_DEFAULT_SENDER: env.EMAIL_DEFAULT_SENDER,
|
||||
SUPPORT_URL: links.SUPPORT,
|
||||
APPLICATION_NAME: metadata.APPLICATION_NAME,
|
||||
ORGANIZATION_URL: links.ORGANIZATION,
|
||||
PUBLIC_REGISTRATION: env.PUBLIC_REGISTRATION === 'true' || false,
|
||||
INVITE_REGISTRATION: env.INVITE_REGISTRATION !== 'false', // default = true
|
||||
}
|
||||
|
||||
// Check if all required configs are present
|
||||
if (require.resolve) {
|
||||
// are we in a nodejs environment?
|
||||
Object.entries(requiredConfigs).map((entry) => {
|
||||
Object.entries(required).map((entry) => {
|
||||
if (!entry[1]) {
|
||||
throw new Error(`ERROR: "${entry[0]}" env variable is missing.`)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
export const smtpConfigs = {
|
||||
SMTP_HOST,
|
||||
SMTP_PORT,
|
||||
SMTP_IGNORE_TLS,
|
||||
SMTP_USERNAME,
|
||||
SMTP_PASSWORD,
|
||||
}
|
||||
export const neo4jConfigs = { NEO4J_URI, NEO4J_USERNAME, NEO4J_PASSWORD }
|
||||
export const serverConfigs = {
|
||||
CLIENT_URI,
|
||||
GRAPHQL_URI,
|
||||
PUBLIC_REGISTRATION: process.env.PUBLIC_REGISTRATION === 'true',
|
||||
}
|
||||
|
||||
export const developmentConfigs = {
|
||||
DEBUG: process.env.NODE_ENV !== 'production' && process.env.DEBUG,
|
||||
DISABLED_MIDDLEWARES:
|
||||
(process.env.NODE_ENV !== 'production' && process.env.DISABLED_MIDDLEWARES) || '',
|
||||
}
|
||||
|
||||
export const sentryConfigs = { SENTRY_DSN_BACKEND, COMMIT }
|
||||
export const redisConfigs = { REDIS_DOMAIN, REDIS_PORT, REDIS_PASSWORD }
|
||||
|
||||
const S3_CONFIGURED =
|
||||
AWS_ACCESS_KEY_ID && AWS_SECRET_ACCESS_KEY && AWS_ENDPOINT && AWS_REGION && AWS_BUCKET
|
||||
|
||||
export const s3Configs = {
|
||||
AWS_ACCESS_KEY_ID,
|
||||
AWS_SECRET_ACCESS_KEY,
|
||||
AWS_ENDPOINT,
|
||||
AWS_REGION,
|
||||
AWS_BUCKET,
|
||||
S3_CONFIGURED,
|
||||
}
|
||||
|
||||
export const customConfigs = {
|
||||
EMAIL_DEFAULT_SENDER,
|
||||
SUPPORT_URL: links.SUPPORT,
|
||||
APPLICATION_NAME: metadata.APPLICATION_NAME,
|
||||
ORGANIZATION_URL: links.ORGANIZATION,
|
||||
}
|
||||
|
||||
export default {
|
||||
...requiredConfigs,
|
||||
...smtpConfigs,
|
||||
...neo4jConfigs,
|
||||
...serverConfigs,
|
||||
...developmentConfigs,
|
||||
...sentryConfigs,
|
||||
...redisConfigs,
|
||||
...s3Configs,
|
||||
...customConfigs,
|
||||
...environment,
|
||||
...server,
|
||||
...required,
|
||||
...smtp,
|
||||
...neo4j,
|
||||
...sentry,
|
||||
...redis,
|
||||
...s3,
|
||||
...options,
|
||||
}
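
The rewritten config module above groups related environment variables into plain objects (`environment`, `required`, `server`, `smtp`, and so on) and spreads them into one default export. A minimal consumer sketch, assuming that export shape; the relative import path is illustrative:

```js
// Sketch only, not part of the diff: reading the merged CONFIG object.
import CONFIG from './config'

// S3_CONFIGURED is truthy only when all five AWS_* variables are set,
// so callers can branch on a single flag instead of checking five env vars.
if (!CONFIG.S3_CONFIGURED) {
  // eslint-disable-next-line no-console
  console.log('S3 is not configured, uploads stay on local storage')
}
```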
|
||||
|
||||
@ -1,6 +1,13 @@
|
||||
// this file is duplicated in `backend/src/config/links.js` and `webapp/constants/links.js` and replaced on rebranding
|
||||
export default {
|
||||
ORGANIZATION: 'https://ocelot.social',
|
||||
DONATE: 'https://ocelot-social.herokuapp.com/donations',
|
||||
FAQ: 'https://ocelot.social',
|
||||
SUPPORT: 'https://ocelot.social',
|
||||
|
||||
// on null or empty strings internal imprint is used, see 'webapp/locales/html/'
|
||||
DONATE: 'https://ocelot-social.herokuapp.com/donations', // we use 'ocelot-social.herokuapp.com' at the moment, because redirections of 'ocelot.social' subpages are not working correctly
|
||||
IMPRINT: 'https://ocelot-social.herokuapp.com/imprint', // we use 'ocelot-social.herokuapp.com' at the moment, because redirections of 'ocelot.social' subpages are not working correctly
|
||||
TERMS_AND_CONDITIONS: null,
|
||||
CODE_OF_CONDUCT: null,
|
||||
DATA_PRIVACY: null,
|
||||
FAQ: 'https://ocelot.social',
|
||||
}
|
||||
|
||||
10
backend/src/config/logos.js
Normal file
10
backend/src/config/logos.js
Normal file
@ -0,0 +1,10 @@
|
||||
// this file is duplicated in `backend/src/config/logos.js` and `webapp/constants/logos.js` and replaced on rebranding
|
||||
// these are the paths in the webapp
|
||||
export default {
|
||||
LOGO_HEADER_PATH: '/img/custom/logo-horizontal.svg',
|
||||
LOGO_SIGNUP_PATH: '/img/custom/logo-squared.svg',
|
||||
LOGO_WELCOME_PATH: '/img/custom/logo-squared.svg',
|
||||
LOGO_LOGOUT_PATH: '/img/custom/logo-squared.svg',
|
||||
LOGO_PASSWORD_RESET_PATH: '/img/custom/logo-squared.svg',
|
||||
LOGO_MAINTENACE_RESET_PATH: '/img/custom/logo-squared.svg',
|
||||
}
|
||||
@ -1,7 +1,9 @@
|
||||
// this file is duplicated in `backend/src/config/metadata.js` and `webapp/constants/metadata.js` and replaced on rebranding
|
||||
export default {
|
||||
APPLICATION_NAME: 'ocelot.social',
|
||||
APPLICATION_SHORT_NAME: 'ocelot',
|
||||
APPLICATION_DESCRIPTION: 'ocelot.social Community Network',
|
||||
COOKIE_NAME: 'ocelot-social-token',
|
||||
ORGANIZATION_NAME: 'ocelot.social Community',
|
||||
ORGANIZATION_JURISDICTION: 'City of Angels',
|
||||
}
|
||||
|
||||
@ -1,6 +1,7 @@
|
||||
import { cleanDatabase } from '../db/factories'
|
||||
import CONFIG from '../config'
|
||||
|
||||
if (process.env.NODE_ENV === 'production') {
|
||||
if (CONFIG.PRODUCTION) {
|
||||
throw new Error(`You cannot clean the database in production environment!`)
|
||||
}
|
||||
|
||||
|
||||
@ -5,6 +5,7 @@ import { hashSync } from 'bcryptjs'
|
||||
import { Factory } from 'rosie'
|
||||
import { getDriver, getNeode } from './neo4j'
|
||||
import CONFIG from '../config/index.js'
|
||||
import generateInviteCode from '../schema/resolvers/helpers/generateInviteCode.js'
|
||||
|
||||
const neode = getNeode()
|
||||
|
||||
@ -48,8 +49,9 @@ Factory.define('badge')
|
||||
|
||||
Factory.define('image')
|
||||
.attr('url', faker.image.unsplash.imageUrl)
|
||||
.attr('aspectRatio', 1)
|
||||
.attr('aspectRatio', 1.3333333333333333)
|
||||
.attr('alt', faker.lorem.sentence)
|
||||
.attr('type', 'image/jpeg')
|
||||
.after((buildObject, options) => {
|
||||
const { url: imageUrl } = buildObject
|
||||
if (imageUrl) buildObject.url = uniqueImageUrl(imageUrl)
|
||||
@ -103,12 +105,12 @@ Factory.define('user')
|
||||
})
|
||||
|
||||
Factory.define('post')
|
||||
.option('categoryIds', [])
|
||||
/* .option('categoryIds', [])
|
||||
.option('categories', ['categoryIds'], (categoryIds) => {
|
||||
if (categoryIds.length) return Promise.all(categoryIds.map((id) => neode.find('Category', id)))
|
||||
// there must be at least one category
|
||||
return Promise.all([Factory.build('category')])
|
||||
})
|
||||
}) */
|
||||
.option('tagIds', [])
|
||||
.option('tags', ['tagIds'], (tagIds) => {
|
||||
return Promise.all(tagIds.map((id) => neode.find('Tag', id)))
|
||||
@ -128,6 +130,8 @@ Factory.define('post')
|
||||
deleted: false,
|
||||
imageBlurred: false,
|
||||
imageAspectRatio: 1.333,
|
||||
clickedCount: 0,
|
||||
viewedTeaserCount: 0,
|
||||
})
|
||||
.attr('pinned', ['pinned'], (pinned) => {
|
||||
// Convert false to null
|
||||
@ -143,16 +147,16 @@ Factory.define('post')
|
||||
return language || 'en'
|
||||
})
|
||||
.after(async (buildObject, options) => {
|
||||
const [post, author, image, categories, tags] = await Promise.all([
|
||||
const [post, author, image, /* categories, */ tags] = await Promise.all([
|
||||
neode.create('Post', buildObject),
|
||||
options.author,
|
||||
options.image,
|
||||
options.categories,
|
||||
// options.categories,
|
||||
options.tags,
|
||||
])
|
||||
await Promise.all([
|
||||
post.relateTo(author, 'author'),
|
||||
Promise.all(categories.map((c) => c.relateTo(post, 'post'))),
|
||||
// Promise.all(categories.map((c) => c.relateTo(post, 'post'))),
|
||||
Promise.all(tags.map((t) => t.relateTo(post, 'post'))),
|
||||
])
|
||||
if (image) await post.relateTo(image, 'image')
|
||||
@ -205,7 +209,7 @@ const emailDefaults = {
|
||||
}
|
||||
|
||||
Factory.define('emailAddress')
|
||||
.attr(emailDefaults)
|
||||
.attrs(emailDefaults)
|
||||
.after((buildObject, options) => {
|
||||
return neode.create('EmailAddress', buildObject)
|
||||
})
|
||||
@ -216,6 +220,28 @@ Factory.define('unverifiedEmailAddress')
|
||||
return neode.create('UnverifiedEmailAddress', buildObject)
|
||||
})
|
||||
|
||||
const inviteCodeDefaults = {
|
||||
code: () => generateInviteCode(),
|
||||
createdAt: () => new Date().toISOString(),
|
||||
expiresAt: () => null,
|
||||
}
|
||||
|
||||
Factory.define('inviteCode')
|
||||
.attrs(inviteCodeDefaults)
|
||||
.option('generatedById', null)
|
||||
.option('generatedBy', ['generatedById'], (generatedById) => {
|
||||
if (generatedById) return neode.find('User', generatedById)
|
||||
return Factory.build('user')
|
||||
})
|
||||
.after(async (buildObject, options) => {
|
||||
const [inviteCode, generatedBy] = await Promise.all([
|
||||
neode.create('InviteCode', buildObject),
|
||||
options.generatedBy,
|
||||
])
|
||||
await Promise.all([inviteCode.relateTo(generatedBy, 'generated')])
|
||||
return inviteCode
|
||||
})
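
A hedged usage sketch for the new `inviteCode` factory; the fixed code and the user id are assumptions for illustration (the seeder further down builds the `AAAAAA` code the same way, via the `generatedBy` option):

```js
// Hypothetical test usage, inside an async test; assumes a user with id 'u3' was already seeded.
const inviteCode = await Factory.build(
  'inviteCode',
  { code: 'AAAAAA' },
  { generatedById: 'u3' },
)
```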
|
||||
|
||||
Factory.define('location')
|
||||
.attrs({
|
||||
name: 'Germany',
|
||||
|
||||
9
backend/src/db/migrations/1613589876420-null_mutation.js
Normal file
9
backend/src/db/migrations/1613589876420-null_mutation.js
Normal file
@ -0,0 +1,9 @@
|
||||
'use strict'
|
||||
|
||||
module.exports.up = function (next) {
|
||||
next()
|
||||
}
|
||||
|
||||
module.exports.down = function (next) {
|
||||
next()
|
||||
}
|
||||
@ -0,0 +1,53 @@
|
||||
import { getDriver } from '../../db/neo4j'
|
||||
|
||||
export const description = `
|
||||
This migration adds the clickedCount property to all posts, setting it to 0.
|
||||
`
|
||||
|
||||
module.exports.up = async function (next) {
|
||||
const driver = getDriver()
|
||||
const session = driver.session()
|
||||
const transaction = session.beginTransaction()
|
||||
try {
|
||||
// Implement your migration here.
|
||||
await transaction.run(`
|
||||
MATCH (p:Post)
|
||||
SET p.clickedCount = 0
|
||||
`)
|
||||
await transaction.commit()
|
||||
next()
|
||||
} catch (error) {
|
||||
// eslint-disable-next-line no-console
|
||||
console.log(error)
|
||||
await transaction.rollback()
|
||||
// eslint-disable-next-line no-console
|
||||
console.log('rolled back')
|
||||
throw new Error(error)
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
}
|
||||
|
||||
module.exports.down = async function (next) {
|
||||
const driver = getDriver()
|
||||
const session = driver.session()
|
||||
const transaction = session.beginTransaction()
|
||||
try {
|
||||
// Implement your migration here.
|
||||
await transaction.run(`
|
||||
MATCH (p:Post)
|
||||
REMOVE p.clickedCount
|
||||
`)
|
||||
await transaction.commit()
|
||||
next()
|
||||
} catch (error) {
|
||||
// eslint-disable-next-line no-console
|
||||
console.log(error)
|
||||
await transaction.rollback()
|
||||
// eslint-disable-next-line no-console
|
||||
console.log('rolled back')
|
||||
throw new Error(error)
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
}
|
||||
@ -0,0 +1,53 @@
|
||||
import { getDriver } from '../../db/neo4j'
|
||||
|
||||
export const description = `
|
||||
This migration adds the viewedTeaserCount property to all posts, setting it to 0.
|
||||
`
|
||||
|
||||
module.exports.up = async function (next) {
|
||||
const driver = getDriver()
|
||||
const session = driver.session()
|
||||
const transaction = session.beginTransaction()
|
||||
try {
|
||||
// Implement your migration here.
|
||||
await transaction.run(`
|
||||
MATCH (p:Post)
|
||||
SET p.viewedTeaserCount = 0
|
||||
`)
|
||||
await transaction.commit()
|
||||
next()
|
||||
} catch (error) {
|
||||
// eslint-disable-next-line no-console
|
||||
console.log(error)
|
||||
await transaction.rollback()
|
||||
// eslint-disable-next-line no-console
|
||||
console.log('rolled back')
|
||||
throw new Error(error)
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
}
|
||||
|
||||
module.exports.down = async function (next) {
|
||||
const driver = getDriver()
|
||||
const session = driver.session()
|
||||
const transaction = session.beginTransaction()
|
||||
try {
|
||||
// Implement your migration here.
|
||||
await transaction.run(`
|
||||
MATCH (p:Post)
|
||||
REMOVE p.viewedTeaserCount
|
||||
`)
|
||||
await transaction.commit()
|
||||
next()
|
||||
} catch (error) {
|
||||
// eslint-disable-next-line no-console
|
||||
console.log(error)
|
||||
await transaction.rollback()
|
||||
// eslint-disable-next-line no-console
|
||||
console.log('rolled back')
|
||||
throw new Error(error)
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
}
|
||||
@ -137,100 +137,93 @@ const languages = ['de', 'en', 'es', 'fr', 'it', 'pt', 'pl']
|
||||
}),
|
||||
])
|
||||
|
||||
const [
|
||||
peterLustig,
|
||||
bobDerBaumeister,
|
||||
jennyRostock,
|
||||
huey,
|
||||
dewey,
|
||||
louie,
|
||||
dagobert,
|
||||
] = await Promise.all([
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u1',
|
||||
name: 'Peter Lustig',
|
||||
slug: 'peter-lustig',
|
||||
role: 'admin',
|
||||
},
|
||||
{
|
||||
email: 'admin@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u2',
|
||||
name: 'Bob der Baumeister',
|
||||
slug: 'bob-der-baumeister',
|
||||
role: 'moderator',
|
||||
},
|
||||
{
|
||||
email: 'moderator@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u3',
|
||||
name: 'Jenny Rostock',
|
||||
slug: 'jenny-rostock',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'user@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u4',
|
||||
name: 'Huey',
|
||||
slug: 'huey',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'huey@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u5',
|
||||
name: 'Dewey',
|
||||
slug: 'dewey',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'dewey@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u6',
|
||||
name: 'Louie',
|
||||
slug: 'louie',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'louie@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u7',
|
||||
name: 'Dagobert',
|
||||
slug: 'dagobert',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'dagobert@example.org',
|
||||
},
|
||||
),
|
||||
])
|
||||
const [peterLustig, bobDerBaumeister, jennyRostock, huey, dewey, louie, dagobert] =
|
||||
await Promise.all([
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u1',
|
||||
name: 'Peter Lustig',
|
||||
slug: 'peter-lustig',
|
||||
role: 'admin',
|
||||
},
|
||||
{
|
||||
email: 'admin@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u2',
|
||||
name: 'Bob der Baumeister',
|
||||
slug: 'bob-der-baumeister',
|
||||
role: 'moderator',
|
||||
},
|
||||
{
|
||||
email: 'moderator@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u3',
|
||||
name: 'Jenny Rostock',
|
||||
slug: 'jenny-rostock',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'user@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u4',
|
||||
name: 'Huey',
|
||||
slug: 'huey',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'huey@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u5',
|
||||
name: 'Dewey',
|
||||
slug: 'dewey',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'dewey@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u6',
|
||||
name: 'Louie',
|
||||
slug: 'louie',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'louie@example.org',
|
||||
},
|
||||
),
|
||||
Factory.build(
|
||||
'user',
|
||||
{
|
||||
id: 'u7',
|
||||
name: 'Dagobert',
|
||||
slug: 'dagobert',
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'dagobert@example.org',
|
||||
},
|
||||
),
|
||||
])
|
||||
|
||||
await Promise.all([
|
||||
peterLustig.relateTo(Berlin, 'isIn'),
|
||||
@ -541,6 +534,16 @@ const languages = ['de', 'en', 'es', 'fr', 'it', 'pt', 'pl']
|
||||
),
|
||||
])
|
||||
|
||||
await Factory.build(
|
||||
'inviteCode',
|
||||
{
|
||||
code: 'AAAAAA',
|
||||
},
|
||||
{
|
||||
generatedBy: jennyRostock,
|
||||
},
|
||||
)
|
||||
|
||||
authenticatedUser = await louie.toJson()
|
||||
const mention1 =
|
||||
'Hey <a class="mention" data-mention-id="u3" href="/profile/u3">@jenny-rostock</a>, what\'s up?'
|
||||
@ -551,7 +554,7 @@ const languages = ['de', 'en', 'es', 'fr', 'it', 'pt', 'pl']
|
||||
const hashtagAndMention1 =
|
||||
'The new physics of <a class="hashtag" data-hashtag-id="QuantenFlussTheorie" href="/?hashtag=QuantenFlussTheorie">#QuantenFlussTheorie</a> can explain <a class="hashtag" data-hashtag-id="QuantumGravity" href="/?hashtag=QuantumGravity">#QuantumGravity</a>! <a class="mention" data-mention-id="u1" href="/profile/u1">@peter-lustig</a> got that already. ;-)'
|
||||
const createPostMutation = gql`
|
||||
mutation($id: ID, $title: String!, $content: String!, $categoryIds: [ID]) {
|
||||
mutation ($id: ID, $title: String!, $content: String!, $categoryIds: [ID]) {
|
||||
CreatePost(id: $id, title: $title, content: $content, categoryIds: $categoryIds) {
|
||||
id
|
||||
}
|
||||
@ -608,7 +611,7 @@ const languages = ['de', 'en', 'es', 'fr', 'it', 'pt', 'pl']
|
||||
const mentionInComment2 =
|
||||
'Did <a class="mention" data-mention-id="u1" href="/profile/u1">@peter-lustig</a> tell you?'
|
||||
const createCommentMutation = gql`
|
||||
mutation($id: ID, $postId: ID!, $content: String!) {
|
||||
mutation ($id: ID, $postId: ID!, $content: String!) {
|
||||
CreateComment(id: $id, postId: $postId, content: $content) {
|
||||
id
|
||||
}
|
||||
@ -931,6 +934,7 @@ const languages = ['de', 'en', 'es', 'fr', 'it', 'pt', 'pl']
|
||||
const additionalUsers = await Promise.all(
|
||||
[...Array(30).keys()].map(() => Factory.build('user')),
|
||||
)
|
||||
|
||||
await Promise.all(
|
||||
additionalUsers.map(async (user) => {
|
||||
await jennyRostock.relateTo(user, 'following')
|
||||
@ -938,6 +942,26 @@ const languages = ['de', 'en', 'es', 'fr', 'it', 'pt', 'pl']
|
||||
}),
|
||||
)
|
||||
|
||||
await Promise.all(
|
||||
[...Array(30).keys()].map((index) => Factory.build('user', { name: `Jenny${index}` })),
|
||||
)
|
||||
|
||||
await Promise.all(
|
||||
[...Array(30).keys()].map(() =>
|
||||
Factory.build(
|
||||
'post',
|
||||
{ content: `Jenny ${faker.lorem.sentence()}` },
|
||||
{
|
||||
categoryIds: ['cat1'],
|
||||
author: jennyRostock,
|
||||
image: Factory.build('image', {
|
||||
url: faker.image.unsplash.objects(),
|
||||
}),
|
||||
},
|
||||
),
|
||||
),
|
||||
)
|
||||
|
||||
await Promise.all(
|
||||
[...Array(30).keys()].map(() =>
|
||||
Factory.build(
|
||||
|
||||
@ -5,7 +5,7 @@ import CONFIG from './../config'
|
||||
export default function encode(user) {
|
||||
const { id, name, slug } = user
|
||||
const token = jwt.sign({ id, name, slug }, CONFIG.JWT_SECRET, {
|
||||
expiresIn: '1d',
|
||||
expiresIn: CONFIG.JWT_EXPIRES,
|
||||
issuer: CONFIG.GRAPHQL_URI,
|
||||
audience: CONFIG.CLIENT_URI,
|
||||
subject: user.id.toString(),
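
The hunk above swaps the hard-coded `'1d'` lifetime for `CONFIG.JWT_EXPIRES`. For context, a sketch of the matching verification side, assuming the same CONFIG values; this is not the project's actual decode implementation:

```js
// Hypothetical counterpart to encode(); verifies issuer/audience with the same CONFIG values.
import jwt from 'jsonwebtoken'
import CONFIG from './../config'

export default function decode(token) {
  // throws if the token is expired or was issued for another issuer/audience
  return jwt.verify(token, CONFIG.JWT_SECRET, {
    issuer: CONFIG.GRAPHQL_URI,
    audience: CONFIG.CLIENT_URI,
  })
}
```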
|
||||
|
||||
@ -13,7 +13,7 @@ const hasAuthData = CONFIG.SMTP_USERNAME && CONFIG.SMTP_PASSWORD
|
||||
|
||||
let sendMail = () => {}
|
||||
if (!hasEmailConfig) {
|
||||
if (process.env.NODE_ENV !== 'test') {
|
||||
if (!CONFIG.TEST) {
|
||||
// eslint-disable-next-line no-console
|
||||
console.log('Warning: Email middleware will not try to send mails.')
|
||||
}
|
||||
@ -22,8 +22,8 @@ if (!hasEmailConfig) {
|
||||
const transporter = nodemailer.createTransport({
|
||||
host: CONFIG.SMTP_HOST,
|
||||
port: CONFIG.SMTP_PORT,
|
||||
ignoreTLS: CONFIG.SMTP_IGNORE_TLS === 'true',
|
||||
secure: false, // true for 465, false for other ports
|
||||
ignoreTLS: CONFIG.SMTP_IGNORE_TLS,
|
||||
secure: CONFIG.SMTP_SECURE, // true for 465, false for other ports
|
||||
auth: hasAuthData && {
|
||||
user: CONFIG.SMTP_USERNAME,
|
||||
pass: CONFIG.SMTP_PASSWORD,
|
||||
@ -43,9 +43,14 @@ if (!hasEmailConfig) {
|
||||
}
|
||||
|
||||
const sendSignupMail = async (resolve, root, args, context, resolveInfo) => {
|
||||
const { inviteCode } = args
|
||||
const response = await resolve(root, args, context, resolveInfo)
|
||||
const { email, nonce } = response
|
||||
await sendMail(signupTemplate({ email, nonce }))
|
||||
if (inviteCode) {
|
||||
await sendMail(signupTemplate({ email, nonce, inviteCode }))
|
||||
} else {
|
||||
await sendMail(signupTemplate({ email, nonce }))
|
||||
}
|
||||
delete response.nonce
|
||||
return response
|
||||
}
|
||||
@ -71,6 +76,5 @@ export default {
|
||||
AddEmailAddress: sendEmailVerificationMail,
|
||||
requestPasswordReset: sendPasswordResetMail,
|
||||
Signup: sendSignupMail,
|
||||
SignupByInvitation: sendSignupMail,
|
||||
},
|
||||
}
|
||||
|
||||
@ -1,10 +1,11 @@
|
||||
import mustache from 'mustache'
|
||||
import CONFIG from '../../config'
|
||||
import logosWebapp from '../../config/logos.js'
|
||||
|
||||
import * as templates from './templates'
|
||||
|
||||
const from = CONFIG.EMAIL_DEFAULT_SENDER
|
||||
const welcomeImageUrl = new URL(`/img/custom/welcome.svg`, CONFIG.CLIENT_URI)
|
||||
const welcomeImageUrl = new URL(logosWebapp.LOGO_WELCOME_PATH, CONFIG.CLIENT_URI)
|
||||
|
||||
const defaultParams = {
|
||||
supportUrl: CONFIG.SUPPORT_URL,
|
||||
@ -13,11 +14,18 @@ const defaultParams = {
|
||||
welcomeImageUrl,
|
||||
}
|
||||
|
||||
export const signupTemplate = ({ email, nonce }) => {
|
||||
export const signupTemplate = ({ email, nonce, inviteCode = null }) => {
|
||||
const subject = `Willkommen, Bienvenue, Welcome to ${CONFIG.APPLICATION_NAME}!`
|
||||
const actionUrl = new URL('/registration/create-user-account', CONFIG.CLIENT_URI)
|
||||
actionUrl.searchParams.set('nonce', nonce)
|
||||
// dev format example: http://localhost:3000/registration?method=invite-mail&email=wolle.huss%40pjannto.com&nonce=64853
|
||||
const actionUrl = new URL('/registration', CONFIG.CLIENT_URI)
|
||||
actionUrl.searchParams.set('email', email)
|
||||
actionUrl.searchParams.set('nonce', nonce)
|
||||
if (inviteCode) {
|
||||
actionUrl.searchParams.set('inviteCode', inviteCode)
|
||||
actionUrl.searchParams.set('method', 'invite-code')
|
||||
} else {
|
||||
actionUrl.searchParams.set('method', 'invite-mail')
|
||||
}
|
||||
|
||||
return {
|
||||
from,
|
||||
@ -34,8 +42,8 @@ export const signupTemplate = ({ email, nonce }) => {
|
||||
export const emailVerificationTemplate = ({ email, nonce, name }) => {
|
||||
const subject = 'Neue E-Mail Adresse | New E-Mail Address'
|
||||
const actionUrl = new URL('/settings/my-email-address/verify', CONFIG.CLIENT_URI)
|
||||
actionUrl.searchParams.set('nonce', nonce)
|
||||
actionUrl.searchParams.set('email', email)
|
||||
actionUrl.searchParams.set('nonce', nonce)
|
||||
|
||||
return {
|
||||
from,
|
||||
@ -77,7 +85,7 @@ export const wrongAccountTemplate = ({ email }) => {
|
||||
subject,
|
||||
html: mustache.render(
|
||||
templates.layout,
|
||||
{ actionUrl, supportUrl: CONFIG.SUPPORT_URL, welcomeImageUrl },
|
||||
{ ...defaultParams, actionUrl, supportUrl: CONFIG.SUPPORT_URL, welcomeImageUrl },
|
||||
{ content: templates.wrongAccount },
|
||||
),
|
||||
}
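
The signup template now builds a `/registration` URL and encodes the registration method in the query string (see the dev-format comment above). An illustration of the resulting URL for the invite-mail case, using the example values from that comment; parameter order follows the code, not the comment:

```js
// Illustration only; email and nonce are the example values from the comment above.
const actionUrl = new URL('/registration', 'http://localhost:3000')
actionUrl.searchParams.set('email', 'wolle.huss@pjannto.com')
actionUrl.searchParams.set('nonce', '64853')
actionUrl.searchParams.set('method', 'invite-mail')

console.log(actionUrl.toString())
// http://localhost:3000/registration?email=wolle.huss%40pjannto.com&nonce=64853&method=invite-mail
```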
|
||||
|
||||
@ -7,8 +7,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
@ -105,8 +105,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
@ -7,8 +7,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
@ -105,8 +105,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
@ -7,8 +7,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
@ -118,8 +118,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
@ -7,8 +7,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
@ -105,8 +105,8 @@
|
||||
<td style="background-color: #ffffff;">
|
||||
<img
|
||||
src="{{{ welcomeImageUrl }}}"
|
||||
width="600" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 600px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block;"
|
||||
width="300" height="" alt="Welcome image" border="0"
|
||||
style="width: 100%; max-width: 300px; height: auto; background: #ffffff; font-family: Lato, sans-serif; font-size: 16px; line-height: 15px; color: #555555; margin: auto; display: block; padding: 20px;"
|
||||
class="g-img">
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
@ -13,7 +13,7 @@ const driver = getDriver()
|
||||
const neode = getNeode()
|
||||
const categoryIds = ['cat9']
|
||||
const createPostMutation = gql`
|
||||
mutation($id: ID, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
mutation ($id: ID, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
CreatePost(id: $id, title: $title, content: $postContent, categoryIds: $categoryIds) {
|
||||
id
|
||||
title
|
||||
@ -22,7 +22,7 @@ const createPostMutation = gql`
|
||||
}
|
||||
`
|
||||
const updatePostMutation = gql`
|
||||
mutation($id: ID!, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
mutation ($id: ID!, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
UpdatePost(id: $id, content: $postContent, title: $title, categoryIds: $categoryIds) {
|
||||
title
|
||||
content
|
||||
@ -95,7 +95,7 @@ describe('hashtags', () => {
|
||||
</p>
|
||||
`
|
||||
const postWithHastagsQuery = gql`
|
||||
query($id: ID) {
|
||||
query ($id: ID) {
|
||||
Post(id: $id) {
|
||||
tags {
|
||||
id
|
||||
|
||||
@ -14,6 +14,8 @@ import notifications from './notifications/notificationsMiddleware'
|
||||
import hashtags from './hashtags/hashtagsMiddleware'
|
||||
import email from './email/emailMiddleware'
|
||||
import sentry from './sentryMiddleware'
|
||||
import languages from './languages/languages'
|
||||
import userInteractions from './userInteractions'
|
||||
|
||||
export default (schema) => {
|
||||
const middlewares = {
|
||||
@ -30,6 +32,8 @@ export default (schema) => {
|
||||
softDelete,
|
||||
includedFields,
|
||||
orderBy,
|
||||
languages,
|
||||
userInteractions,
|
||||
}
|
||||
|
||||
let order = [
|
||||
@ -38,7 +42,9 @@ export default (schema) => {
|
||||
'xss',
|
||||
// 'activityPub', disabled temporarily
|
||||
'validation',
|
||||
'userInteractions',
|
||||
'sluggify',
|
||||
'languages',
|
||||
'excerpt',
|
||||
'email',
|
||||
'notifications',
|
||||
|
||||
28
backend/src/middleware/languages/languages.js
Normal file
@ -0,0 +1,28 @@
import LanguageDetect from 'languagedetect'
import sanitizeHtml from 'sanitize-html'

const removeHtmlTags = (input) => {
  return sanitizeHtml(input, {
    allowedTags: [],
    allowedAttributes: {},
  })
}

const setPostLanguage = (text) => {
  const lngDetector = new LanguageDetect()
  lngDetector.setLanguageType('iso2')
  return lngDetector.detect(removeHtmlTags(text), 1)[0][0]
}

export default {
  Mutation: {
    CreatePost: async (resolve, root, args, context, info) => {
      args.language = await setPostLanguage(args.content)
      return resolve(root, args, context, info)
    },
    UpdatePost: async (resolve, root, args, context, info) => {
      args.language = await setPostLanguage(args.content)
      return resolve(root, args, context, info)
    },
  },
}
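A minimal usage sketch of the detection helper added above (function name and wiring assumed, not part of this commit): strip markup with sanitize-html, then ask languagedetect for the single best ISO 639-1 match.

import LanguageDetect from 'languagedetect'
import sanitizeHtml from 'sanitize-html'

const detectLanguage = (html) => {
  // remove all tags and attributes so only plain text is scored
  const text = sanitizeHtml(html, { allowedTags: [], allowedAttributes: {} })
  const lngDetector = new LanguageDetect()
  lngDetector.setLanguageType('iso2')
  const best = lngDetector.detect(text, 1)[0] // e.g. ['de', 0.2]
  return best && best[0]
}

// detectLanguage('<p>Jeder sollte vor seiner eigenen Tür kehren.</p>') // => 'de'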
132
backend/src/middleware/languages/languages.spec.js
Normal file
@ -0,0 +1,132 @@
|
||||
import Factory, { cleanDatabase } from '../../db/factories'
|
||||
import { gql } from '../../helpers/jest'
|
||||
import { getNeode, getDriver } from '../../db/neo4j'
|
||||
import createServer from '../../server'
|
||||
import { createTestClient } from 'apollo-server-testing'
|
||||
|
||||
let mutate
|
||||
let authenticatedUser
|
||||
let variables
|
||||
|
||||
const driver = getDriver()
|
||||
const neode = getNeode()
|
||||
|
||||
beforeAll(async () => {
|
||||
const { server } = createServer({
|
||||
context: () => {
|
||||
return {
|
||||
driver,
|
||||
neode,
|
||||
user: authenticatedUser,
|
||||
}
|
||||
},
|
||||
})
|
||||
mutate = createTestClient(server).mutate
|
||||
})
|
||||
|
||||
afterAll(async () => {
|
||||
await cleanDatabase()
|
||||
})
|
||||
|
||||
const createPostMutation = gql`
|
||||
mutation ($title: String!, $content: String!, $categoryIds: [ID]) {
|
||||
CreatePost(title: $title, content: $content, categoryIds: $categoryIds) {
|
||||
language
|
||||
}
|
||||
}
|
||||
`
|
||||
|
||||
describe('languagesMiddleware', () => {
|
||||
variables = {
|
||||
title: 'Test post languages',
|
||||
categoryIds: ['cat9'],
|
||||
}
|
||||
|
||||
beforeAll(async () => {
|
||||
await cleanDatabase()
|
||||
const user = await Factory.build('user')
|
||||
authenticatedUser = await user.toJson()
|
||||
await Factory.build('category', {
|
||||
id: 'cat9',
|
||||
name: 'Democracy & Politics',
|
||||
icon: 'university',
|
||||
})
|
||||
})
|
||||
|
||||
it('detects German', async () => {
|
||||
variables = {
|
||||
...variables,
|
||||
content: 'Jeder sollte vor seiner eigenen Tür kehren.',
|
||||
}
|
||||
await expect(
|
||||
mutate({
|
||||
mutation: createPostMutation,
|
||||
variables,
|
||||
}),
|
||||
).resolves.toMatchObject({
|
||||
data: {
|
||||
CreatePost: {
|
||||
language: 'de',
|
||||
},
|
||||
},
|
||||
})
|
||||
})
|
||||
|
||||
it('detects English', async () => {
|
||||
variables = {
|
||||
...variables,
|
||||
content: 'A journey of a thousand miles begins with a single step.',
|
||||
}
|
||||
await expect(
|
||||
mutate({
|
||||
mutation: createPostMutation,
|
||||
variables,
|
||||
}),
|
||||
).resolves.toMatchObject({
|
||||
data: {
|
||||
CreatePost: {
|
||||
language: 'en',
|
||||
},
|
||||
},
|
||||
})
|
||||
})
|
||||
|
||||
it('detects Spanish', async () => {
|
||||
variables = {
|
||||
...variables,
|
||||
content: 'A caballo regalado, no le mires el diente.',
|
||||
}
|
||||
await expect(
|
||||
mutate({
|
||||
mutation: createPostMutation,
|
||||
variables,
|
||||
}),
|
||||
).resolves.toMatchObject({
|
||||
data: {
|
||||
CreatePost: {
|
||||
language: 'es',
|
||||
},
|
||||
},
|
||||
})
|
||||
})
|
||||
|
||||
it('detects German in between lots of html tags', async () => {
|
||||
variables = {
|
||||
...variables,
|
||||
content:
|
||||
'<strong>Jeder</strong> <strike>sollte</strike> <strong>vor</strong> <span>seiner</span> eigenen <blockquote>Tür</blockquote> kehren.',
|
||||
}
|
||||
await expect(
|
||||
mutate({
|
||||
mutation: createPostMutation,
|
||||
variables,
|
||||
}),
|
||||
).resolves.toMatchObject({
|
||||
data: {
|
||||
CreatePost: {
|
||||
language: 'de',
|
||||
},
|
||||
},
|
||||
})
|
||||
})
|
||||
})
|
||||
@ -10,7 +10,7 @@ const driver = getDriver()
|
||||
const neode = getNeode()
|
||||
const categoryIds = ['cat9']
|
||||
const createPostMutation = gql`
|
||||
mutation($id: ID, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
mutation ($id: ID, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
CreatePost(id: $id, title: $title, content: $postContent, categoryIds: $categoryIds) {
|
||||
id
|
||||
title
|
||||
@ -19,7 +19,7 @@ const createPostMutation = gql`
|
||||
}
|
||||
`
|
||||
const updatePostMutation = gql`
|
||||
mutation($id: ID!, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
mutation ($id: ID!, $title: String!, $postContent: String!, $categoryIds: [ID]!) {
|
||||
UpdatePost(id: $id, content: $postContent, title: $title, categoryIds: $categoryIds) {
|
||||
title
|
||||
content
|
||||
@ -27,7 +27,7 @@ const updatePostMutation = gql`
|
||||
}
|
||||
`
|
||||
const createCommentMutation = gql`
|
||||
mutation($id: ID, $postId: ID!, $commentContent: String!) {
|
||||
mutation ($id: ID, $postId: ID!, $commentContent: String!) {
|
||||
CreateComment(id: $id, postId: $postId, content: $commentContent) {
|
||||
id
|
||||
content
|
||||
@ -80,7 +80,7 @@ afterEach(async () => {
|
||||
|
||||
describe('notifications', () => {
|
||||
const notificationQuery = gql`
|
||||
query($read: Boolean) {
|
||||
query ($read: Boolean) {
|
||||
notifications(read: $read, orderBy: updatedAt_desc) {
|
||||
read
|
||||
reason
|
||||
@ -367,7 +367,7 @@ describe('notifications', () => {
|
||||
describe('if the notification was marked as read earlier', () => {
|
||||
const markAsReadAction = async () => {
|
||||
const mutation = gql`
|
||||
mutation($id: ID!) {
|
||||
mutation ($id: ID!) {
|
||||
markAsRead(id: $id) {
|
||||
read
|
||||
}
|
||||
|
||||
@ -1,6 +1,7 @@
|
||||
import { rule, shield, deny, allow, or } from 'graphql-shield'
|
||||
import { getNeode } from '../db/neo4j'
|
||||
import CONFIG from '../config'
|
||||
import { validateInviteCode } from '../schema/resolvers/transactions/inviteCodes'
|
||||
|
||||
const debug = !!CONFIG.DEBUG
|
||||
const allowExternalErrors = true
|
||||
@ -29,15 +30,25 @@ const onlyYourself = rule({
|
||||
|
||||
const isMyOwn = rule({
|
||||
cache: 'no_cache',
|
||||
})(async (parent, args, context, info) => {
|
||||
return context.user.id === parent.id
|
||||
})(async (parent, args, { user }, info) => {
|
||||
return user && user.id === parent.id
|
||||
})
|
||||
|
||||
const isMySocialMedia = rule({
|
||||
cache: 'no_cache',
|
||||
})(async (_, args, { user }) => {
|
||||
// We need a User
|
||||
if (!user) {
|
||||
return false
|
||||
}
|
||||
let socialMedia = await neode.find('SocialMedia', args.id)
|
||||
socialMedia = await socialMedia.toJson()
|
||||
// Did we find a social media node?
|
||||
if (!socialMedia) {
|
||||
return false
|
||||
}
|
||||
socialMedia = await socialMedia.toJson() // whats this for?
|
||||
|
||||
// Is it my social media entry?
|
||||
return socialMedia.ownedBy.node.id === user.id
|
||||
})
|
||||
|
||||
@ -77,7 +88,14 @@ const noEmailFilter = rule({
|
||||
return !('email' in args)
|
||||
})
|
||||
|
||||
const publicRegistration = rule()(() => !!CONFIG.PUBLIC_REGISTRATION)
|
||||
const publicRegistration = rule()(() => CONFIG.PUBLIC_REGISTRATION)
|
||||
|
||||
const inviteRegistration = rule()(async (_parent, args, { user, driver }) => {
|
||||
if (!CONFIG.INVITE_REGISTRATION) return false
|
||||
const { inviteCode } = args
|
||||
const session = driver.session()
|
||||
return validateInviteCode(session, inviteCode)
|
||||
})
|
||||
|
||||
// Permissions
|
||||
export default shield(
|
||||
@ -86,7 +104,10 @@ export default shield(
|
||||
'*': deny,
|
||||
findPosts: allow,
|
||||
findUsers: allow,
|
||||
findResources: allow,
|
||||
searchResults: allow,
|
||||
searchPosts: allow,
|
||||
searchUsers: allow,
|
||||
searchHashtags: allow,
|
||||
embed: allow,
|
||||
Category: allow,
|
||||
Tag: allow,
|
||||
@ -106,12 +127,17 @@ export default shield(
|
||||
notifications: isAuthenticated,
|
||||
Donations: isAuthenticated,
|
||||
userData: isAuthenticated,
|
||||
MyInviteCodes: isAuthenticated,
|
||||
isValidInviteCode: allow,
|
||||
VerifyNonce: allow,
|
||||
queryLocations: isAuthenticated,
|
||||
availableRoles: isAdmin,
|
||||
getInviteCode: isAuthenticated, // and inviteRegistration
|
||||
},
|
||||
Mutation: {
|
||||
'*': deny,
|
||||
login: allow,
|
||||
SignupByInvitation: allow,
|
||||
Signup: or(publicRegistration, isAdmin),
|
||||
Signup: or(publicRegistration, inviteRegistration, isAdmin),
|
||||
SignupVerification: allow,
|
||||
UpdateUser: onlyYourself,
|
||||
CreatePost: isAuthenticated,
|
||||
@ -149,6 +175,9 @@ export default shield(
|
||||
pinPost: isAdmin,
|
||||
unpinPost: isAdmin,
|
||||
UpdateDonations: isAdmin,
|
||||
GenerateInviteCode: isAuthenticated,
|
||||
switchUserRole: isAdmin,
|
||||
markTeaserAsViewed: allow,
|
||||
},
|
||||
User: {
|
||||
email: or(isMyOwn, isAdmin),
|
||||
|
||||
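The Signup permission above combines three rules with graphql-shield's or. A hedged, self-contained sketch of how that composition behaves (the rule bodies here are stand-ins, not the project's real checks, which are defined in the diff above):

import { rule, or } from 'graphql-shield'

// stand-in rules for illustration only
const publicRegistration = rule()(() => false)
const inviteRegistration = rule()(async (_parent, args) => args.inviteCode === 'AAAAAA')
const isAdmin = rule({ cache: 'no_cache' })(async (_parent, _args, { user }) => !!user && user.role === 'admin')

// access is granted if any one of the three rules resolves to true
export const Signup = or(publicRegistration, inviteRegistration, isAdmin)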
@ -3,11 +3,13 @@ import createServer from '../server'
|
||||
import Factory, { cleanDatabase } from '../db/factories'
|
||||
import { gql } from '../helpers/jest'
|
||||
import { getDriver, getNeode } from '../db/neo4j'
|
||||
import CONFIG from '../config'
|
||||
|
||||
const instance = getNeode()
|
||||
const driver = getDriver()
|
||||
|
||||
let query, authenticatedUser, owner, anotherRegularUser, administrator, variables, moderator
|
||||
let query, mutate, variables
|
||||
let authenticatedUser, owner, anotherRegularUser, administrator, moderator
|
||||
|
||||
describe('authorization', () => {
|
||||
beforeAll(async () => {
|
||||
@ -20,6 +22,7 @@ describe('authorization', () => {
|
||||
}),
|
||||
})
|
||||
query = createTestClient(server).query
|
||||
mutate = createTestClient(server).mutate
|
||||
})
|
||||
|
||||
afterEach(async () => {
|
||||
@ -77,7 +80,7 @@ describe('authorization', () => {
|
||||
|
||||
describe('access email address', () => {
|
||||
const userQuery = gql`
|
||||
query($name: String) {
|
||||
query ($name: String) {
|
||||
User(name: $name) {
|
||||
email
|
||||
}
|
||||
@ -159,5 +162,132 @@ describe('authorization', () => {
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('access Signup', () => {
|
||||
const signupMutation = gql`
|
||||
mutation ($email: String!, $inviteCode: String) {
|
||||
Signup(email: $email, inviteCode: $inviteCode) {
|
||||
email
|
||||
}
|
||||
}
|
||||
`
|
||||
|
||||
describe('admin invite only', () => {
|
||||
beforeEach(async () => {
|
||||
variables = {
|
||||
email: 'some@email.org',
|
||||
inviteCode: 'AAAAAA',
|
||||
}
|
||||
CONFIG.INVITE_REGISTRATION = false
|
||||
CONFIG.PUBLIC_REGISTRATION = false
|
||||
await Factory.build('inviteCode', {
|
||||
code: 'AAAAAA',
|
||||
})
|
||||
})
|
||||
|
||||
describe('as user', () => {
|
||||
beforeEach(async () => {
|
||||
authenticatedUser = await anotherRegularUser.toJson()
|
||||
})
|
||||
|
||||
it('denies permission', async () => {
|
||||
await expect(mutate({ mutation: signupMutation, variables })).resolves.toMatchObject({
|
||||
errors: [{ message: 'Not Authorised!' }],
|
||||
data: { Signup: null },
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('as admin', () => {
|
||||
beforeEach(async () => {
|
||||
authenticatedUser = await administrator.toJson()
|
||||
})
|
||||
|
||||
it('returns an email', async () => {
|
||||
await expect(mutate({ mutation: signupMutation, variables })).resolves.toMatchObject({
|
||||
errors: undefined,
|
||||
data: {
|
||||
Signup: { email: 'some@email.org' },
|
||||
},
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('public registration', () => {
|
||||
beforeEach(async () => {
|
||||
variables = {
|
||||
email: 'some@email.org',
|
||||
inviteCode: 'AAAAAA',
|
||||
}
|
||||
CONFIG.INVITE_REGISTRATION = false
|
||||
CONFIG.PUBLIC_REGISTRATION = true
|
||||
await Factory.build('inviteCode', {
|
||||
code: 'AAAAAA',
|
||||
})
|
||||
})
|
||||
|
||||
describe('as anyone', () => {
|
||||
beforeEach(async () => {
|
||||
authenticatedUser = null
|
||||
})
|
||||
|
||||
it('returns an email', async () => {
|
||||
await expect(mutate({ mutation: signupMutation, variables })).resolves.toMatchObject({
|
||||
errors: undefined,
|
||||
data: {
|
||||
Signup: { email: 'some@email.org' },
|
||||
},
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('invite registration', () => {
|
||||
beforeEach(async () => {
|
||||
CONFIG.INVITE_REGISTRATION = true
|
||||
CONFIG.PUBLIC_REGISTRATION = false
|
||||
await Factory.build('inviteCode', {
|
||||
code: 'AAAAAA',
|
||||
})
|
||||
})
|
||||
|
||||
describe('as anyone with valid invite code', () => {
|
||||
beforeEach(async () => {
|
||||
variables = {
|
||||
email: 'some@email.org',
|
||||
inviteCode: 'AAAAAA',
|
||||
}
|
||||
authenticatedUser = null
|
||||
})
|
||||
|
||||
it('returns an email', async () => {
|
||||
await expect(mutate({ mutation: signupMutation, variables })).resolves.toMatchObject({
|
||||
errors: undefined,
|
||||
data: {
|
||||
Signup: { email: 'some@email.org' },
|
||||
},
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('as anyone without valid invite', () => {
|
||||
beforeEach(async () => {
|
||||
variables = {
|
||||
email: 'some@email.org',
|
||||
inviteCode: 'no valid invite code',
|
||||
}
|
||||
authenticatedUser = null
|
||||
})
|
||||
|
||||
it('denies permission', async () => {
|
||||
await expect(mutate({ mutation: signupMutation, variables })).resolves.toMatchObject({
|
||||
errors: [{ message: 'Not Authorised!' }],
|
||||
data: { Signup: null },
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
@ -1,16 +1,16 @@
|
||||
import { sentry } from 'graphql-middleware-sentry'
|
||||
import { sentryConfigs } from '../config'
|
||||
import CONFIG from '../config'
|
||||
|
||||
let sentryMiddleware = (resolve, root, args, context, resolveInfo) =>
|
||||
resolve(root, args, context, resolveInfo)
|
||||
|
||||
if (sentryConfigs.SENTRY_DSN_BACKEND) {
|
||||
if (CONFIG.SENTRY_DSN_BACKEND) {
|
||||
sentryMiddleware = sentry({
|
||||
forwardErrors: true,
|
||||
config: {
|
||||
dsn: sentryConfigs.SENTRY_DSN_BACKEND,
|
||||
release: sentryConfigs.COMMIT,
|
||||
environment: process.env.NODE_ENV,
|
||||
dsn: CONFIG.SENTRY_DSN_BACKEND,
|
||||
release: CONFIG.COMMIT,
|
||||
environment: CONFIG.NODE_ENV,
|
||||
},
|
||||
withScope: (scope, error, context) => {
|
||||
scope.setUser({
|
||||
@ -23,7 +23,7 @@ if (sentryConfigs.SENTRY_DSN_BACKEND) {
|
||||
})
|
||||
} else {
|
||||
// eslint-disable-next-line no-console
|
||||
if (process.env.NODE_ENV !== 'test') console.log('Warning: Sentry middleware inactive.')
|
||||
if (!CONFIG.TEST) console.log('Warning: Sentry middleware inactive.')
|
||||
}
|
||||
|
||||
export default sentryMiddleware
|
||||
|
||||
@ -2,6 +2,7 @@ import slugify from 'slug'
|
||||
export default async function uniqueSlug(string, isUnique) {
|
||||
const slug = slugify(string || 'anonymous', {
|
||||
lower: true,
|
||||
multicharmap: { Ä: 'AE', ä: 'ae', Ö: 'OE', ö: 'oe', Ü: 'UE', ü: 'ue', ß: 'ss' },
|
||||
})
|
||||
if (await isUnique(slug)) return slug
|
||||
|
||||
|
||||
@ -18,4 +18,16 @@ describe('uniqueSlug', () => {
|
||||
const isUnique = jest.fn().mockResolvedValue(true)
|
||||
expect(uniqueSlug(string, isUnique)).resolves.toEqual('anonymous')
|
||||
})
|
||||
|
||||
it('Converts umlaut to a two letter equivalent', async () => {
|
||||
const umlaut = 'ÄÖÜäöüß'
|
||||
const isUnique = jest.fn().mockResolvedValue(true)
|
||||
await expect(uniqueSlug(umlaut, isUnique)).resolves.toEqual('aeoeueaeoeuess')
|
||||
})
|
||||
|
||||
it('Removes Spanish enya and diacritics', async () => {
|
||||
const diacritics = 'áàéèíìóòúùñçÁÀÉÈÍÌÓÒÚÙÑÇ'
|
||||
const isUnique = jest.fn().mockResolvedValue(true)
|
||||
await expect(uniqueSlug(diacritics, isUnique)).resolves.toEqual('aaeeiioouuncaaeeiioouunc')
|
||||
})
|
||||
})
|
||||
|
||||
@ -11,7 +11,8 @@ let variables
|
||||
const driver = getDriver()
|
||||
const neode = getNeode()
|
||||
|
||||
beforeAll(() => {
|
||||
beforeAll(async () => {
|
||||
await cleanDatabase()
|
||||
const { server } = createServer({
|
||||
context: () => {
|
||||
return {
|
||||
@ -53,7 +54,7 @@ describe('slugifyMiddleware', () => {
|
||||
describe('CreatePost', () => {
|
||||
const categoryIds = ['cat9']
|
||||
const createPostMutation = gql`
|
||||
mutation($title: String!, $content: String!, $categoryIds: [ID]!, $slug: String) {
|
||||
mutation ($title: String!, $content: String!, $categoryIds: [ID]!, $slug: String) {
|
||||
CreatePost(title: $title, content: $content, categoryIds: $categoryIds, slug: $slug) {
|
||||
slug
|
||||
}
|
||||
@ -163,7 +164,7 @@ describe('slugifyMiddleware', () => {
|
||||
|
||||
describe('SignupVerification', () => {
|
||||
const mutation = gql`
|
||||
mutation(
|
||||
mutation (
|
||||
$password: String!
|
||||
$email: String!
|
||||
$name: String!
|
||||
|
||||
44
backend/src/middleware/userInteractions.js
Normal file
@ -0,0 +1,44 @@
const createRelatedCypher = (relation) => `
  MATCH (user:User { id: $currentUser})
  MATCH (post:Post { id: $postId})
  OPTIONAL MATCH (post)<-[r:${relation}]-(u:User)
  WHERE NOT u.disabled AND NOT u.deleted
  WITH user, post, count(DISTINCT u) AS count
  MERGE (user)-[relation:${relation} { }]->(post)
  ON CREATE
    SET relation.count = 1,
      relation.createdAt = toString(datetime()),
      post.clickedCount = count + 1
  ON MATCH
    SET relation.count = relation.count + 1,
      relation.updatedAt = toString(datetime()),
      post.clickedCount = count
  RETURN user, post, relation
`

const setPostCounter = async (postId, relation, context) => {
  const {
    user: { id: currentUser },
  } = context
  const session = context.driver.session()
  try {
    await session.writeTransaction((txc) => {
      return txc.run(createRelatedCypher(relation), { currentUser, postId })
    })
  } finally {
    session.close()
  }
}

const userClickedPost = async (resolve, root, args, context, info) => {
  if (args.id) {
    await setPostCounter(args.id, 'CLICKED', context)
  }
  return resolve(root, args, context, info)
}

export default {
  Query: {
    Post: userClickedPost,
  },
}
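The entry Query.Post: userClickedPost above follows graphql-middleware's wrapping pattern: perform a side effect, then delegate to the wrapped resolver. A minimal, generic sketch of that pattern (names assumed, not part of this commit):

// count a click only for single-post queries, then hand over to the original resolver
const withClickCounter = (countClick) => async (resolve, root, args, context, info) => {
  if (args.id) await countClick(args.id, context)
  return resolve(root, args, context, info)
}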
98
backend/src/middleware/userInteractions.spec.js
Normal file
@ -0,0 +1,98 @@
|
||||
import Factory, { cleanDatabase } from '../db/factories'
|
||||
import { gql } from '../helpers/jest'
|
||||
import { getNeode, getDriver } from '../db/neo4j'
|
||||
import createServer from '../server'
|
||||
import { createTestClient } from 'apollo-server-testing'
|
||||
|
||||
let query, aUser, bUser, post, authenticatedUser, variables
|
||||
|
||||
const driver = getDriver()
|
||||
const neode = getNeode()
|
||||
|
||||
const postQuery = gql`
|
||||
query ($id: ID) {
|
||||
Post(id: $id) {
|
||||
clickedCount
|
||||
}
|
||||
}
|
||||
`
|
||||
|
||||
beforeAll(async () => {
|
||||
await cleanDatabase()
|
||||
aUser = await Factory.build('user', {
|
||||
id: 'a-user',
|
||||
})
|
||||
bUser = await Factory.build('user', {
|
||||
id: 'b-user',
|
||||
})
|
||||
post = await Factory.build('post')
|
||||
authenticatedUser = await aUser.toJson()
|
||||
const { server } = createServer({
|
||||
context: () => {
|
||||
return {
|
||||
driver,
|
||||
neode,
|
||||
user: authenticatedUser,
|
||||
}
|
||||
},
|
||||
})
|
||||
query = createTestClient(server).query
|
||||
})
|
||||
|
||||
afterAll(async () => {
|
||||
await cleanDatabase()
|
||||
})
|
||||
|
||||
describe('middleware/userInteractions', () => {
|
||||
describe('given one post', () => {
|
||||
it('does not change clickedCount when queried without ID', async () => {
|
||||
await expect(query({ query: postQuery, variables })).resolves.toMatchObject({
|
||||
data: {
|
||||
Post: expect.arrayContaining([
|
||||
{
|
||||
clickedCount: 0,
|
||||
},
|
||||
]),
|
||||
},
|
||||
})
|
||||
})
|
||||
|
||||
it('changes clickedCount when queried with ID', async () => {
|
||||
variables = { id: post.get('id') }
|
||||
await expect(query({ query: postQuery, variables })).resolves.toMatchObject({
|
||||
data: {
|
||||
Post: expect.arrayContaining([
|
||||
{
|
||||
clickedCount: 1,
|
||||
},
|
||||
]),
|
||||
},
|
||||
})
|
||||
})
|
||||
|
||||
it('does not change clickedCount when same user queries the post again', async () => {
|
||||
await expect(query({ query: postQuery, variables })).resolves.toMatchObject({
|
||||
data: {
|
||||
Post: expect.arrayContaining([
|
||||
{
|
||||
clickedCount: 1,
|
||||
},
|
||||
]),
|
||||
},
|
||||
})
|
||||
})
|
||||
|
||||
it('changes clickedCount when another user queries the post', async () => {
|
||||
authenticatedUser = await bUser.toJson()
|
||||
await expect(query({ query: postQuery, variables })).resolves.toMatchObject({
|
||||
data: {
|
||||
Post: expect.arrayContaining([
|
||||
{
|
||||
clickedCount: 2,
|
||||
},
|
||||
]),
|
||||
},
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
@ -2,8 +2,6 @@ import { UserInputError } from 'apollo-server'
|
||||
|
||||
const COMMENT_MIN_LENGTH = 1
|
||||
const NO_POST_ERR_MESSAGE = 'Comment cannot be created without a post!'
|
||||
const NO_CATEGORIES_ERR_MESSAGE =
|
||||
'You cannot save a post without at least one category or more than three'
|
||||
const USERNAME_MIN_LENGTH = 3
|
||||
const validateCreateComment = async (resolve, root, args, context, info) => {
|
||||
const content = args.content.replace(/<(?:.|\n)*?>/gm, '').trim()
|
||||
@ -46,20 +44,6 @@ const validateUpdateComment = async (resolve, root, args, context, info) => {
|
||||
return resolve(root, args, context, info)
|
||||
}
|
||||
|
||||
const validatePost = async (resolve, root, args, context, info) => {
|
||||
const { categoryIds } = args
|
||||
if (!Array.isArray(categoryIds) || !categoryIds.length || categoryIds.length > 3) {
|
||||
throw new UserInputError(NO_CATEGORIES_ERR_MESSAGE)
|
||||
}
|
||||
return resolve(root, args, context, info)
|
||||
}
|
||||
|
||||
const validateUpdatePost = async (resolve, root, args, context, info) => {
|
||||
const { categoryIds } = args
|
||||
if (typeof categoryIds === 'undefined') return resolve(root, args, context, info)
|
||||
return validatePost(resolve, root, args, context, info)
|
||||
}
|
||||
|
||||
const validateReport = async (resolve, root, args, context, info) => {
|
||||
const { resourceId } = args
|
||||
const { user } = context
|
||||
@ -138,8 +122,6 @@ export default {
|
||||
Mutation: {
|
||||
CreateComment: validateCreateComment,
|
||||
UpdateComment: validateUpdateComment,
|
||||
CreatePost: validatePost,
|
||||
UpdatePost: validateUpdatePost,
|
||||
UpdateUser: validateUpdateUser,
|
||||
fileReport: validateReport,
|
||||
review: validateReview,
|
||||
|
||||
@ -17,42 +17,22 @@ let authenticatedUser,
|
||||
commentingUser
|
||||
|
||||
const createCommentMutation = gql`
|
||||
mutation($id: ID, $postId: ID!, $content: String!) {
|
||||
mutation ($id: ID, $postId: ID!, $content: String!) {
|
||||
CreateComment(id: $id, postId: $postId, content: $content) {
|
||||
id
|
||||
}
|
||||
}
|
||||
`
|
||||
const updateCommentMutation = gql`
|
||||
mutation($content: String!, $id: ID!) {
|
||||
mutation ($content: String!, $id: ID!) {
|
||||
UpdateComment(content: $content, id: $id) {
|
||||
id
|
||||
}
|
||||
}
|
||||
`
|
||||
const createPostMutation = gql`
|
||||
mutation($id: ID, $title: String!, $content: String!, $language: String, $categoryIds: [ID]) {
|
||||
CreatePost(
|
||||
id: $id
|
||||
title: $title
|
||||
content: $content
|
||||
language: $language
|
||||
categoryIds: $categoryIds
|
||||
) {
|
||||
id
|
||||
}
|
||||
}
|
||||
`
|
||||
|
||||
const updatePostMutation = gql`
|
||||
mutation($id: ID!, $title: String!, $content: String!, $categoryIds: [ID]) {
|
||||
UpdatePost(id: $id, title: $title, content: $content, categoryIds: $categoryIds) {
|
||||
id
|
||||
}
|
||||
}
|
||||
`
|
||||
const reportMutation = gql`
|
||||
mutation($resourceId: ID!, $reasonCategory: ReasonCategory!, $reasonDescription: String!) {
|
||||
mutation ($resourceId: ID!, $reasonCategory: ReasonCategory!, $reasonDescription: String!) {
|
||||
fileReport(
|
||||
resourceId: $resourceId
|
||||
reasonCategory: $reasonCategory
|
||||
@ -63,7 +43,7 @@ const reportMutation = gql`
|
||||
}
|
||||
`
|
||||
const reviewMutation = gql`
|
||||
mutation($resourceId: ID!, $disable: Boolean, $closed: Boolean) {
|
||||
mutation ($resourceId: ID!, $disable: Boolean, $closed: Boolean) {
|
||||
review(resourceId: $resourceId, disable: $disable, closed: $closed) {
|
||||
createdAt
|
||||
updatedAt
|
||||
@ -72,7 +52,7 @@ const reviewMutation = gql`
|
||||
`
|
||||
|
||||
const updateUserMutation = gql`
|
||||
mutation($id: ID!, $name: String) {
|
||||
mutation ($id: ID!, $name: String) {
|
||||
UpdateUser(id: $id, name: $name) {
|
||||
name
|
||||
}
|
||||
@ -227,104 +207,6 @@ describe('validateCreateComment', () => {
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('validatePost', () => {
|
||||
let createPostVariables
|
||||
beforeEach(async () => {
|
||||
createPostVariables = {
|
||||
title: 'I am a title',
|
||||
content: 'Some content',
|
||||
}
|
||||
authenticatedUser = await commentingUser.toJson()
|
||||
})
|
||||
|
||||
describe('categories', () => {
|
||||
describe('null', () => {
|
||||
it('throws UserInputError', async () => {
|
||||
createPostVariables = { ...createPostVariables, categoryIds: null }
|
||||
await expect(
|
||||
mutate({ mutation: createPostMutation, variables: createPostVariables }),
|
||||
).resolves.toMatchObject({
|
||||
data: { CreatePost: null },
|
||||
errors: [
|
||||
{
|
||||
message: 'You cannot save a post without at least one category or more than three',
|
||||
},
|
||||
],
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('empty', () => {
|
||||
it('throws UserInputError', async () => {
|
||||
createPostVariables = { ...createPostVariables, categoryIds: [] }
|
||||
await expect(
|
||||
mutate({ mutation: createPostMutation, variables: createPostVariables }),
|
||||
).resolves.toMatchObject({
|
||||
data: { CreatePost: null },
|
||||
errors: [
|
||||
{
|
||||
message: 'You cannot save a post without at least one category or more than three',
|
||||
},
|
||||
],
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('more than 3 categoryIds', () => {
|
||||
it('throws UserInputError', async () => {
|
||||
createPostVariables = {
|
||||
...createPostVariables,
|
||||
categoryIds: ['cat9', 'cat27', 'cat15', 'cat4'],
|
||||
}
|
||||
await expect(
|
||||
mutate({ mutation: createPostMutation, variables: createPostVariables }),
|
||||
).resolves.toMatchObject({
|
||||
data: { CreatePost: null },
|
||||
errors: [
|
||||
{
|
||||
message: 'You cannot save a post without at least one category or more than three',
|
||||
},
|
||||
],
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('validateUpdatePost', () => {
|
||||
describe('post created without categories somehow', () => {
|
||||
let owner, updatePostVariables
|
||||
beforeEach(async () => {
|
||||
const postSomehowCreated = await neode.create('Post', {
|
||||
id: 'how-was-this-created',
|
||||
})
|
||||
owner = await neode.create('User', {
|
||||
id: 'author-of-post-without-category',
|
||||
slug: 'hacker',
|
||||
})
|
||||
await postSomehowCreated.relateTo(owner, 'author')
|
||||
authenticatedUser = await owner.toJson()
|
||||
updatePostVariables = {
|
||||
id: 'how-was-this-created',
|
||||
title: 'I am a title',
|
||||
content: 'Some content',
|
||||
categoryIds: [],
|
||||
}
|
||||
})
|
||||
|
||||
it('requires at least one category for successful update', async () => {
|
||||
await expect(
|
||||
mutate({ mutation: updatePostMutation, variables: updatePostVariables }),
|
||||
).resolves.toMatchObject({
|
||||
data: { UpdatePost: null },
|
||||
errors: [
|
||||
{ message: 'You cannot save a post without at least one category or more than three' },
|
||||
],
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('validateReport', () => {
|
||||
|
||||
@ -3,5 +3,6 @@ export default {
|
||||
alt: { type: 'string' },
|
||||
sensitive: { type: 'boolean', default: false },
|
||||
aspectRatio: { type: 'float', default: 1.0 },
|
||||
type: { type: 'string' },
|
||||
createdAt: { type: 'string', isoDate: true, default: () => new Date().toISOString() },
|
||||
}
|
||||
|
||||
@ -1,16 +1,17 @@
|
||||
export default {
|
||||
code: { type: 'string', primary: true },
|
||||
createdAt: { type: 'string', isoDate: true, default: () => new Date().toISOString() },
|
||||
token: { type: 'string', primary: true, token: true },
|
||||
generatedBy: {
|
||||
expiresAt: { type: 'string', isoDate: true, default: null },
|
||||
generated: {
|
||||
type: 'relationship',
|
||||
relationship: 'GENERATED',
|
||||
target: 'User',
|
||||
direction: 'in',
|
||||
},
|
||||
activated: {
|
||||
redeemed: {
|
||||
type: 'relationship',
|
||||
relationship: 'ACTIVATED',
|
||||
target: 'EmailAddress',
|
||||
direction: 'out',
|
||||
relationship: 'REDEEMED',
|
||||
target: 'User',
|
||||
direction: 'in',
|
||||
},
|
||||
}
|
||||
@ -22,6 +22,8 @@ export default {
|
||||
contentExcerpt: { type: 'string', allow: [null] },
|
||||
deleted: { type: 'boolean', default: false },
|
||||
disabled: { type: 'boolean', default: false },
|
||||
clickedCount: { type: 'int', default: 0 },
|
||||
viewedTeaserCount: { type: 'int', default: 0 },
|
||||
notified: {
|
||||
type: 'relationship',
|
||||
relationship: 'NOTIFIED',
|
||||
|
||||
@ -100,6 +100,18 @@ export default {
|
||||
target: 'User',
|
||||
direction: 'in',
|
||||
},
|
||||
inviteCodes: {
|
||||
type: 'relationship',
|
||||
relationship: 'GENERATED',
|
||||
target: 'InviteCode',
|
||||
direction: 'out',
|
||||
},
|
||||
redeemedInviteCode: {
|
||||
type: 'relationship',
|
||||
relationship: 'REDEEMED',
|
||||
target: 'InviteCode',
|
||||
direction: 'out',
|
||||
},
|
||||
termsAndConditionsAgreedVersion: {
|
||||
type: 'string',
|
||||
allow: [null],
|
||||
|
||||
@ -15,4 +15,5 @@ export default {
|
||||
Donations: require('./Donations.js').default,
|
||||
Report: require('./Report.js').default,
|
||||
Migration: require('./Migration.js').default,
|
||||
InviteCode: require('./InviteCode.js').default,
|
||||
}
|
||||
|
||||
@ -36,7 +36,7 @@ afterEach(async () => {
|
||||
})
|
||||
|
||||
const createCommentMutation = gql`
|
||||
mutation($id: ID, $postId: ID!, $content: String!) {
|
||||
mutation ($id: ID, $postId: ID!, $content: String!) {
|
||||
CreateComment(id: $id, postId: $postId, content: $content) {
|
||||
id
|
||||
content
|
||||
@ -128,7 +128,7 @@ describe('CreateComment', () => {
|
||||
|
||||
describe('UpdateComment', () => {
|
||||
const updateCommentMutation = gql`
|
||||
mutation($content: String!, $id: ID!) {
|
||||
mutation ($content: String!, $id: ID!) {
|
||||
UpdateComment(content: $content, id: $id) {
|
||||
id
|
||||
content
|
||||
@ -220,7 +220,7 @@ describe('UpdateComment', () => {
|
||||
|
||||
describe('DeleteComment', () => {
|
||||
const deleteCommentMutation = gql`
|
||||
mutation($id: ID!) {
|
||||
mutation ($id: ID!) {
|
||||
DeleteComment(id: $id) {
|
||||
id
|
||||
content
|
||||
|
||||
@ -9,7 +9,7 @@ const instance = getNeode()
|
||||
const driver = getDriver()
|
||||
|
||||
const updateDonationsMutation = gql`
|
||||
mutation($goal: Int, $progress: Int) {
|
||||
mutation ($goal: Int, $progress: Int) {
|
||||
UpdateDonations(goal: $goal, progress: $progress) {
|
||||
id
|
||||
goal
|
||||
|
||||
@ -6,6 +6,27 @@ import Validator from 'neode/build/Services/Validator.js'
|
||||
import normalizeEmail from './helpers/normalizeEmail'
|
||||
|
||||
export default {
|
||||
Query: {
|
||||
VerifyNonce: async (_parent, args, context, _resolveInfo) => {
|
||||
const session = context.driver.session()
|
||||
const readTxResultPromise = session.readTransaction(async (txc) => {
|
||||
const result = await txc.run(
|
||||
`
|
||||
MATCH (email:EmailAddress {email: $email, nonce: $nonce})
|
||||
RETURN count(email) > 0 AS result
|
||||
`,
|
||||
{ email: args.email, nonce: args.nonce },
|
||||
)
|
||||
return result
|
||||
})
|
||||
try {
|
||||
const txResult = await readTxResultPromise
|
||||
return txResult.records[0].get('result')
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
},
|
||||
},
|
||||
Mutation: {
|
||||
AddEmailAddress: async (_parent, args, context, _resolveInfo) => {
|
||||
let response
|
||||
|
||||
@ -6,7 +6,7 @@ import { createTestClient } from 'apollo-server-testing'
|
||||
|
||||
const neode = getNeode()
|
||||
|
||||
let mutate
|
||||
let mutate, query
|
||||
let authenticatedUser
|
||||
let user
|
||||
let variables
|
||||
@ -16,7 +16,8 @@ beforeEach(async () => {
|
||||
variables = {}
|
||||
})
|
||||
|
||||
beforeAll(() => {
|
||||
beforeAll(async () => {
|
||||
await cleanDatabase()
|
||||
const { server } = createServer({
|
||||
context: () => {
|
||||
return {
|
||||
@ -27,6 +28,7 @@ beforeAll(() => {
|
||||
},
|
||||
})
|
||||
mutate = createTestClient(server).mutate
|
||||
query = createTestClient(server).query
|
||||
})
|
||||
|
||||
afterEach(async () => {
|
||||
@ -35,7 +37,7 @@ afterEach(async () => {
|
||||
|
||||
describe('AddEmailAddress', () => {
|
||||
const mutation = gql`
|
||||
mutation($email: String!) {
|
||||
mutation ($email: String!) {
|
||||
AddEmailAddress(email: $email) {
|
||||
email
|
||||
verifiedAt
|
||||
@ -140,7 +142,7 @@ describe('AddEmailAddress', () => {
|
||||
|
||||
describe('VerifyEmailAddress', () => {
|
||||
const mutation = gql`
|
||||
mutation($email: String!, $nonce: String!) {
|
||||
mutation ($email: String!, $nonce: String!) {
|
||||
VerifyEmailAddress(email: $email, nonce: $nonce) {
|
||||
email
|
||||
createdAt
|
||||
@ -185,7 +187,7 @@ describe('VerifyEmailAddress', () => {
|
||||
let emailAddress
|
||||
beforeEach(async () => {
|
||||
emailAddress = await Factory.build('unverifiedEmailAddress', {
|
||||
nonce: 'abcdef',
|
||||
nonce: '12345',
|
||||
verifiedAt: null,
|
||||
createdAt: new Date().toISOString(),
|
||||
email: 'to-be-verified@example.org',
|
||||
@ -204,7 +206,7 @@ describe('VerifyEmailAddress', () => {
|
||||
|
||||
describe('given valid nonce for `UnverifiedEmailAddress` node', () => {
|
||||
beforeEach(() => {
|
||||
variables = { ...variables, nonce: 'abcdef' }
|
||||
variables = { ...variables, nonce: '12345' }
|
||||
})
|
||||
|
||||
describe('but the address does not belong to the authenticated user', () => {
|
||||
@ -295,3 +297,40 @@ describe('VerifyEmailAddress', () => {
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
describe('VerifyNonce', () => {
|
||||
beforeEach(async () => {
|
||||
await Factory.build('emailAddress', {
|
||||
nonce: '12345',
|
||||
verifiedAt: null,
|
||||
createdAt: new Date().toISOString(),
|
||||
email: 'to-be-verified@example.org',
|
||||
})
|
||||
})
|
||||
|
||||
const verifyNonceQuery = gql`
|
||||
query ($email: String!, $nonce: String!) {
|
||||
VerifyNonce(email: $email, nonce: $nonce)
|
||||
}
|
||||
`
|
||||
|
||||
it('returns true when nonce and email match', async () => {
|
||||
variables = {
|
||||
email: 'to-be-verified@example.org',
|
||||
nonce: '12345',
|
||||
}
|
||||
await expect(query({ query: verifyNonceQuery, variables })).resolves.toMatchObject({
|
||||
data: { VerifyNonce: true },
|
||||
})
|
||||
})
|
||||
|
||||
it('returns false when nonce and email do not match', async () => {
|
||||
variables = {
|
||||
email: 'to-be-verified@example.org',
|
||||
nonce: '---',
|
||||
}
|
||||
await expect(query({ query: verifyNonceQuery, variables })).resolves.toMatchObject({
|
||||
data: { VerifyNonce: false },
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
@ -37,8 +37,7 @@ const babyLovesCatEmbedResponse = new Response(
|
||||
thumbnail_height: 360,
|
||||
provider_url: 'https://www.youtube.com/',
|
||||
thumbnail_width: 480,
|
||||
html:
|
||||
'<iframe width="480" height="270" src="https://www.youtube.com/embed/qkdXAtO40Fo?start=18&feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>',
|
||||
html: '<iframe width="480" height="270" src="https://www.youtube.com/embed/qkdXAtO40Fo?start=18&feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>',
|
||||
thumbnail_url: 'https://i.ytimg.com/vi/qkdXAtO40Fo/hqdefault.jpg',
|
||||
version: '1.0',
|
||||
author_name: 'Merkley Family',
|
||||
@ -57,7 +56,7 @@ describe('Query', () => {
|
||||
})
|
||||
const { query } = createTestClient(server)
|
||||
const embed = gql`
|
||||
query($url: String!) {
|
||||
query ($url: String!) {
|
||||
embed(url: $url) {
|
||||
type
|
||||
title
|
||||
@ -204,8 +203,7 @@ Have all the information for the brand in separate config files. Set these defau
|
||||
video: null,
|
||||
lang: 'de',
|
||||
sources: ['resource', 'oembed'],
|
||||
html:
|
||||
'<iframe width="480" height="270" src="https://www.youtube.com/embed/qkdXAtO40Fo?start=18&feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>',
|
||||
html: '<iframe width="480" height="270" src="https://www.youtube.com/embed/qkdXAtO40Fo?start=18&feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>',
|
||||
},
|
||||
},
|
||||
errors: undefined,
|
||||
|
||||
@ -16,7 +16,7 @@ let user2
|
||||
let variables
|
||||
|
||||
const mutationFollowUser = gql`
|
||||
mutation($id: ID!) {
|
||||
mutation ($id: ID!) {
|
||||
followUser(id: $id) {
|
||||
name
|
||||
followedBy {
|
||||
@ -29,7 +29,7 @@ const mutationFollowUser = gql`
|
||||
`
|
||||
|
||||
const mutationUnfollowUser = gql`
|
||||
mutation($id: ID!) {
|
||||
mutation ($id: ID!) {
|
||||
unfollowUser(id: $id) {
|
||||
name
|
||||
followedBy {
|
||||
@ -42,7 +42,7 @@ const mutationUnfollowUser = gql`
|
||||
`
|
||||
|
||||
const userQuery = gql`
|
||||
query($id: ID) {
|
||||
query ($id: ID) {
|
||||
User(id: $id) {
|
||||
followedBy {
|
||||
id
|
||||
|
||||
@ -0,0 +1,8 @@
export default function generateInviteCode() {
  // 6 random numbers in [ 0, 35 ] are 36 possible numbers (10 [0-9] + 26 [A-Z])
  return Array.from({ length: 6 }, (n = Math.floor(Math.random() * 36)) => {
    // n > 9: it is a letter (ASCII 65 is A) -> 10 + 55 = 65
    // else: it is a number (ASCII 48 is 0) -> 0 + 48 = 48
    return String.fromCharCode(n > 9 ? n + 55 : n + 48)
  }).join('')
}
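The mapping in that helper folds the values 0–35 onto the 36-character alphabet [0-9A-Z]; a small worked check of the two branches (illustration only):

const toInviteChar = (n) => String.fromCharCode(n > 9 ? n + 55 : n + 48)

toInviteChar(0)  // '0'  (0 + 48 = 48)
toInviteChar(9)  // '9'  (9 + 48 = 57)
toInviteChar(10) // 'A'  (10 + 55 = 65)
toInviteChar(35) // 'Z'  (35 + 55 = 90)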
@ -1,4 +1,5 @@
import { v4 as uuid } from 'uuid'
export default function generateNonce() {
  return uuid().substring(0, 6)
  return Array.from({ length: 5 }, (n = Math.floor(Math.random() * 10)) => {
    return String.fromCharCode(n + 48)
  }).join('')
}

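After this change a nonce is five random decimal digits (ASCII 48 is '0'), e.g. '40713'. An equivalent, more explicit sketch of the same behaviour (illustration only):

const generateNonce = () =>
  Array.from({ length: 5 }, () => Math.floor(Math.random() * 10)).join('')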
@ -2,7 +2,7 @@ import Resolver from './helpers/Resolver'
|
||||
export default {
|
||||
Image: {
|
||||
...Resolver('Image', {
|
||||
undefinedToNull: ['sensitive', 'alt', 'aspectRatio'],
|
||||
undefinedToNull: ['sensitive', 'alt', 'aspectRatio', 'type'],
|
||||
}),
|
||||
},
|
||||
}
|
||||
|
||||
@ -5,10 +5,10 @@ import slug from 'slug'
|
||||
import { existsSync, unlinkSync, createWriteStream } from 'fs'
|
||||
import { UserInputError } from 'apollo-server'
|
||||
import { getDriver } from '../../../db/neo4j'
|
||||
import { s3Configs } from '../../../config'
|
||||
import CONFIG from '../../../config'
|
||||
|
||||
// const widths = [34, 160, 320, 640, 1024]
|
||||
const { AWS_ENDPOINT: endpoint, AWS_REGION: region, AWS_BUCKET: Bucket, S3_CONFIGURED } = s3Configs
|
||||
const { AWS_ENDPOINT: endpoint, AWS_REGION: region, AWS_BUCKET: Bucket, S3_CONFIGURED } = CONFIG
|
||||
|
||||
export async function deleteImage(resource, relationshipType, opts = {}) {
|
||||
sanitizeRelationshipType(relationshipType)
|
||||
@ -53,8 +53,8 @@ export async function mergeImage(resource, relationshipType, imageInput, opts =
|
||||
if (!(existingImage || upload)) throw new UserInputError('Cannot find image for given resource')
|
||||
if (existingImage && upload) deleteImageFile(existingImage, deleteCallback)
|
||||
const url = await uploadImageFile(upload, uploadCallback)
|
||||
const { alt, sensitive, aspectRatio } = imageInput
|
||||
const image = { alt, sensitive, aspectRatio, url }
|
||||
const { alt, sensitive, aspectRatio, type } = imageInput
|
||||
const image = { alt, sensitive, aspectRatio, url, type }
|
||||
txResult = await transaction.run(
|
||||
`
|
||||
MATCH (resource {id: $resource.id})
|
||||
|
||||
137
backend/src/schema/resolvers/inviteCodes.js
Normal file
@ -0,0 +1,137 @@
|
||||
import generateInviteCode from './helpers/generateInviteCode'
|
||||
import Resolver from './helpers/Resolver'
|
||||
import { validateInviteCode } from './transactions/inviteCodes'
|
||||
|
||||
const uniqueInviteCode = async (session, code) => {
|
||||
return session.readTransaction(async (txc) => {
|
||||
const result = await txc.run(`MATCH (ic:InviteCode { id: $code }) RETURN count(ic) AS count`, {
|
||||
code,
|
||||
})
|
||||
return parseInt(String(result.records[0].get('count'))) === 0
|
||||
})
|
||||
}
|
||||
|
||||
export default {
|
||||
Query: {
|
||||
getInviteCode: async (_parent, args, context, _resolveInfo) => {
|
||||
const {
|
||||
user: { id: userId },
|
||||
} = context
|
||||
const session = context.driver.session()
|
||||
const readTxResultPromise = session.readTransaction(async (txc) => {
|
||||
const result = await txc.run(
|
||||
`MATCH (user:User {id: $userId})-[:GENERATED]->(ic:InviteCode)
|
||||
WHERE ic.expiresAt IS NULL
|
||||
OR datetime(ic.expiresAt) >= datetime()
|
||||
RETURN properties(ic) AS inviteCodes`,
|
||||
{
|
||||
userId,
|
||||
},
|
||||
)
|
||||
return result.records.map((record) => record.get('inviteCodes'))
|
||||
})
|
||||
try {
|
||||
const inviteCode = await readTxResultPromise
|
||||
if (inviteCode && inviteCode.length > 0) return inviteCode[0]
|
||||
let code = generateInviteCode()
|
||||
while (!(await uniqueInviteCode(session, code))) {
|
||||
code = generateInviteCode()
|
||||
}
|
||||
const writeTxResultPromise = session.writeTransaction(async (txc) => {
|
||||
const result = await txc.run(
|
||||
`MATCH (user:User {id: $userId})
|
||||
MERGE (user)-[:GENERATED]->(ic:InviteCode { code: $code })
|
||||
ON CREATE SET
|
||||
ic.createdAt = toString(datetime()),
|
||||
ic.expiresAt = $expiresAt
|
||||
RETURN ic AS inviteCode`,
|
||||
{
|
||||
userId,
|
||||
code,
|
||||
expiresAt: null,
|
||||
},
|
||||
)
|
||||
return result.records.map((record) => record.get('inviteCode').properties)
|
||||
})
|
||||
const txResult = await writeTxResultPromise
|
||||
return txResult[0]
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
},
|
||||
MyInviteCodes: async (_parent, args, context, _resolveInfo) => {
|
||||
const {
|
||||
user: { id: userId },
|
||||
} = context
|
||||
const session = context.driver.session()
|
||||
const readTxResultPromise = session.readTransaction(async (txc) => {
|
||||
const result = await txc.run(
|
||||
`MATCH (user:User {id: $userId})-[:GENERATED]->(ic:InviteCode)
|
||||
RETURN properties(ic) AS inviteCodes`,
|
||||
{
|
||||
userId,
|
||||
},
|
||||
)
|
||||
return result.records.map((record) => record.get('inviteCodes'))
|
||||
})
|
||||
try {
|
||||
const txResult = await readTxResultPromise
|
||||
return txResult
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
},
|
||||
isValidInviteCode: async (_parent, args, context, _resolveInfo) => {
|
||||
const { code } = args
|
||||
const session = context.driver.session()
|
||||
if (!code) return false
|
||||
return validateInviteCode(session, code)
|
||||
},
|
||||
},
|
||||
Mutation: {
|
||||
GenerateInviteCode: async (_parent, args, context, _resolveInfo) => {
|
||||
const {
|
||||
user: { id: userId },
|
||||
} = context
|
||||
const session = context.driver.session()
|
||||
let code = generateInviteCode()
|
||||
while (!(await uniqueInviteCode(session, code))) {
|
||||
code = generateInviteCode()
|
||||
}
|
||||
const writeTxResultPromise = session.writeTransaction(async (txc) => {
|
||||
const result = await txc.run(
|
||||
`MATCH (user:User {id: $userId})
|
||||
MERGE (user)-[:GENERATED]->(ic:InviteCode { code: $code })
|
||||
ON CREATE SET
|
||||
ic.createdAt = toString(datetime()),
|
||||
ic.expiresAt = $expiresAt
|
||||
RETURN ic AS inviteCode`,
|
||||
{
|
||||
userId,
|
||||
code,
|
||||
expiresAt: args.expiresAt,
|
||||
},
|
||||
)
|
||||
return result.records.map((record) => record.get('inviteCode').properties)
|
||||
})
|
||||
try {
|
||||
const txResult = await writeTxResultPromise
|
||||
return txResult[0]
|
||||
} finally {
|
||||
session.close()
|
||||
}
|
||||
},
|
||||
},
|
||||
InviteCode: {
|
||||
...Resolver('InviteCode', {
|
||||
idAttribute: 'code',
|
||||
undefinedToNull: ['expiresAt'],
|
||||
hasOne: {
|
||||
generatedBy: '<-[:GENERATED]-(related:User)',
|
||||
},
|
||||
hasMany: {
|
||||
redeemedBy: '<-[:REDEEMED]-(related:User)',
|
||||
},
|
||||
}),
|
||||
},
|
||||
}
|
||||
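Both getInviteCode and GenerateInviteCode above keep calling generateInviteCode until uniqueInviteCode reports an unused code. A minimal, generic sketch of that retry loop (helper names assumed, not part of this commit):

const pickUnusedCode = async (isUnused, generate) => {
  let code = generate()
  while (!(await isUnused(code))) {
    code = generate() // collision: try another random code
  }
  return code
}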
200
backend/src/schema/resolvers/inviteCodes.spec.js
Normal file
@ -0,0 +1,200 @@
|
||||
import Factory, { cleanDatabase } from '../../db/factories'
|
||||
import { getDriver } from '../../db/neo4j'
|
||||
import { gql } from '../../helpers/jest'
|
||||
import createServer from '../../server'
|
||||
import { createTestClient } from 'apollo-server-testing'
|
||||
|
||||
let user
|
||||
let query
|
||||
let mutate
|
||||
|
||||
const driver = getDriver()
|
||||
|
||||
const generateInviteCodeMutation = gql`
|
||||
mutation ($expiresAt: String = null) {
|
||||
GenerateInviteCode(expiresAt: $expiresAt) {
|
||||
code
|
||||
createdAt
|
||||
expiresAt
|
||||
}
|
||||
}
|
||||
`
|
||||
|
||||
const myInviteCodesQuery = gql`
|
||||
query {
|
||||
MyInviteCodes {
|
||||
code
|
||||
createdAt
|
||||
expiresAt
|
||||
}
|
||||
}
|
||||
`
|
||||
|
||||
const isValidInviteCodeQuery = gql`
|
||||
query ($code: ID!) {
|
||||
isValidInviteCode(code: $code)
|
||||
}
|
||||
`
|
||||
|
||||
beforeAll(async () => {
|
||||
await cleanDatabase()
|
||||
const { server } = createServer({
|
||||
context: () => {
|
||||
return {
|
||||
driver,
|
||||
user,
|
||||
}
|
||||
},
|
||||
})
|
||||
query = createTestClient(server).query
|
||||
mutate = createTestClient(server).mutate
|
||||
})
|
||||
|
||||
afterAll(async () => {
|
||||
await cleanDatabase()
|
||||
})
|
||||
|
||||
describe('inviteCodes', () => {
|
||||
describe('as unauthenticated user', () => {
|
||||
it('cannot generate invite codes', async () => {
|
||||
await expect(mutate({ mutation: generateInviteCodeMutation })).resolves.toEqual(
|
||||
expect.objectContaining({
|
||||
errors: expect.arrayContaining([
|
||||
expect.objectContaining({
|
||||
extensions: { code: 'INTERNAL_SERVER_ERROR' },
|
||||
}),
|
||||
]),
|
||||
data: {
|
||||
GenerateInviteCode: null,
|
||||
},
|
||||
}),
|
||||
)
|
||||
})
|
||||
|
||||
it('cannot query invite codes', async () => {
|
||||
await expect(query({ query: myInviteCodesQuery })).resolves.toEqual(
|
||||
expect.objectContaining({
|
||||
errors: expect.arrayContaining([
|
||||
expect.objectContaining({
|
||||
extensions: { code: 'INTERNAL_SERVER_ERROR' },
|
||||
}),
|
||||
]),
|
||||
data: {
|
||||
MyInviteCodes: null,
|
||||
},
|
||||
}),
|
||||
)
|
||||
})
|
||||
})
|
||||
|
||||
describe('as authenticated user', () => {
|
||||
beforeAll(async () => {
|
||||
const authenticatedUser = await Factory.build(
|
||||
'user',
|
||||
{
|
||||
role: 'user',
|
||||
},
|
||||
{
|
||||
email: 'user@example.org',
|
||||
password: '1234',
|
||||
},
|
||||
)
|
||||
user = await authenticatedUser.toJson()
|
||||
})
|
||||
|
||||
it('generates an invite code without expiresAt', async () => {
|
||||
await expect(mutate({ mutation: generateInviteCodeMutation })).resolves.toEqual(
|
||||
expect.objectContaining({
|
||||
errors: undefined,
|
||||
data: {
|
||||
GenerateInviteCode: {
|
||||
code: expect.stringMatching(/^[0-9A-Z]{6,6}$/),
|
||||
expiresAt: null,
|
||||
createdAt: expect.any(String),
|
||||
},
|
||||
},
|
||||
}),
|
||||
)
|
||||
})
|
||||
|
||||
it('generates an invite code with expiresAt', async () => {
|
||||
const nextWeek = new Date()
|
||||
nextWeek.setDate(nextWeek.getDate() + 7)
|
||||
await expect(
|
||||
mutate({
|
||||
mutation: generateInviteCodeMutation,
|
||||
variables: { expiresAt: nextWeek.toISOString() },
|
||||
}),
|
||||
).resolves.toEqual(
|
||||
expect.objectContaining({
|
||||
errors: undefined,
|
||||
data: {
|
||||
GenerateInviteCode: {
|
||||
code: expect.stringMatching(/^[0-9A-Z]{6,6}$/),
|
||||
expiresAt: nextWeek.toISOString(),
|
||||
createdAt: expect.any(String),
|
||||
},
|
||||
},
|
||||
}),
|
||||
)
|
||||
})
|
||||
|
||||
let inviteCodes
|
||||
|
||||
it('returns the created invite codes when queried', async () => {
|
||||
const response = await query({ query: myInviteCodesQuery })
|
||||
inviteCodes = response.data.MyInviteCodes
|
||||
expect(inviteCodes).toHaveLength(2)
|
||||
})
|
||||
|
||||
it('does not return the created invite codes of other users when queried', async () => {
|
||||
await Factory.build('inviteCode')
|
||||
const response = await query({ query: myInviteCodesQuery })
|
||||
inviteCodes = response.data.MyInviteCodes
|
||||
expect(inviteCodes).toHaveLength(2)
|
||||
})
|
||||
|
||||
it('validates an invite code without expiresAt', async () => {
|
||||
const unExpiringInviteCode = inviteCodes.filter((ic) => ic.expiresAt === null)[0].code
|
||||
const result = await query({
|
||||
query: isValidInviteCodeQuery,
|
||||
variables: { code: unExpiringInviteCode },
|
||||
})
|
||||
expect(result.data.isValidInviteCode).toBeTruthy()
|
||||
})
|
||||
|
||||
it('validates an invite code in lower case', async () => {
|
||||
const unExpiringInviteCode = inviteCodes.filter((ic) => ic.expiresAt === null)[0].code
|
||||
const result = await query({
|
||||
query: isValidInviteCodeQuery,
|
||||
variables: { code: unExpiringInviteCode.toLowerCase() },
|
||||
})
|
||||
expect(result.data.isValidInviteCode).toBeTruthy()
|
||||
})
|
||||
|
||||
it('validates an invite code with expiresAt in the future', async () => {
|
||||
const expiringInviteCode = inviteCodes.filter((ic) => ic.expiresAt !== null)[0].code
|
||||
const result = await query({
|
||||
query: isValidInviteCodeQuery,
|
||||
variables: { code: expiringInviteCode },
|
||||
})
|
||||
expect(result.data.isValidInviteCode).toBeTruthy()
|
||||
})
|
||||
|
||||
it('does not validate an invite code which expired in the past', async () => {
|
||||
const lastWeek = new Date()
|
||||
lastWeek.setDate(lastWeek.getDate() - 7)
|
||||
const inviteCode = await Factory.build('inviteCode', {
|
||||
expiresAt: lastWeek.toISOString(),
|
||||
})
|
||||
const code = inviteCode.get('code')
|
||||
const result = await query({ query: isValidInviteCodeQuery, variables: { code } })
|
||||
expect(result.data.isValidInviteCode).toBeFalsy()
|
||||
})
|
||||
|
||||
it('does not validate an invite code which does not exits', async () => {
|
||||
const result = await query({ query: isValidInviteCodeQuery, variables: { code: 'AAA' } })
|
||||
expect(result.data.isValidInviteCode).toBeFalsy()
|
||||
})
|
||||
})
|
||||
})
|
||||
@ -1,4 +1,6 @@
|
||||
import { UserInputError } from 'apollo-server'
|
||||
import Resolver from './helpers/Resolver'
|
||||
import { queryLocations } from './users/location'
|
||||
|
||||
export default {
|
||||
Location: {
|
||||
@ -16,4 +18,13 @@ export default {
|
||||
],
|
||||
}),
|
||||
},
|
||||
Query: {
|
||||
queryLocations: async (object, args, context, resolveInfo) => {
|
||||
try {
|
||||
return queryLocations(args)
|
||||
} catch (e) {
|
||||
throw new UserInputError(e.message)
|
||||
}
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
@ -31,7 +31,7 @@ describe('resolvers', () => {
|
||||
describe('custom mutation, not handled by neo4j-graphql-js', () => {
|
||||
let variables
|
||||
const updateUserMutation = gql`
|
||||
mutation($id: ID!, $name: String) {
|
||||
mutation ($id: ID!, $name: String) {
|
||||
UpdateUser(id: $id, name: $name) {
|
||||
name
|
||||
location {
|
||||
|
||||
@ -16,7 +16,7 @@ let mutate,
|
||||
closeReportVariables
|
||||
|
||||
const reviewMutation = gql`
|
||||
mutation($resourceId: ID!, $disable: Boolean, $closed: Boolean) {
|
||||
mutation ($resourceId: ID!, $disable: Boolean, $closed: Boolean) {
|
||||
review(resourceId: $resourceId, disable: $disable, closed: $closed) {
|
||||
createdAt
|
||||
updatedAt
|
||||
|
||||
@ -139,7 +139,7 @@ describe('given some notifications', () => {
|
||||
|
||||
describe('notifications', () => {
|
||||
const notificationQuery = gql`
|
||||
query($read: Boolean, $orderBy: NotificationOrdering) {
|
||||
query ($read: Boolean, $orderBy: NotificationOrdering) {
|
||||
notifications(read: $read, orderBy: $orderBy) {
|
||||
from {
|
||||
__typename
|
||||
@ -249,7 +249,7 @@ describe('given some notifications', () => {
|
||||
const deletePostAction = async () => {
|
||||
authenticatedUser = await author.toJson()
|
||||
const deletePostMutation = gql`
|
||||
mutation($id: ID!) {
|
||||
mutation ($id: ID!) {
|
||||
DeletePost(id: $id) {
|
||||
id
|
||||
deleted
|
||||
@ -284,7 +284,7 @@ describe('given some notifications', () => {
|
||||
|
||||
describe('markAsRead', () => {
|
||||
const markAsReadMutation = gql`
|
||||
mutation($id: ID!) {
|
||||
mutation ($id: ID!) {
|
||||
markAsRead(id: $id) {
|
||||
from {
|
||||
__typename
|
||||
|
||||
@ -55,7 +55,7 @@ describe('passwordReset', () => {
|
||||
|
||||
describe('requestPasswordReset', () => {
|
||||
const mutation = gql`
|
||||
mutation($email: String!) {
|
||||
mutation ($email: String!) {
|
||||
requestPasswordReset(email: $email)
|
||||
}
|
||||
`
|
||||
@ -116,7 +116,7 @@ describe('resetPassword', () => {
|
||||
}
|
||||
|
||||
const mutation = gql`
|
||||
mutation($nonce: String!, $email: String!, $newPassword: String!) {
|
||||
mutation ($nonce: String!, $email: String!, $newPassword: String!) {
|
||||
resetPassword(nonce: $nonce, email: $email, newPassword: $newPassword)
|
||||
}
|
||||
`
|
||||
@ -196,7 +196,7 @@ describe('resetPassword', () => {
|
||||
it('updates password of the user', async () => {
|
||||
await mutate({ mutation, variables })
|
||||
const checkLoginMutation = gql`
|
||||
mutation($email: String!, $password: String!) {
|
||||
mutation ($email: String!, $password: String!) {
|
||||
login(email: $email, password: $password)
|
||||
}
|
||||
`
|
||||
|
||||
@ -76,7 +76,6 @@ export default {
},
Mutation: {
CreatePost: async (_parent, params, context, _resolveInfo) => {
const { categoryIds } = params
const { image: imageInput } = params
delete params.categoryIds
delete params.image
@ -89,16 +88,14 @@ export default {
SET post += $params
SET post.createdAt = toString(datetime())
SET post.updatedAt = toString(datetime())
SET post.clickedCount = 0
SET post.viewedTeaserCount = 0
WITH post
MATCH (author:User {id: $userId})
MERGE (post)<-[:WROTE]-(author)
WITH post
UNWIND $categoryIds AS categoryId
MATCH (category:Category {id: categoryId})
MERGE (post)-[:CATEGORIZED]->(category)
RETURN post {.*}
`,
{ userId: context.user.id, categoryIds, params },
{ userId: context.user.id, params },
)
const [post] = createPostTransactionResponse.records.map((record) => record.get('post'))
if (imageInput) {
@ -320,13 +317,38 @@ export default {
}
return unpinnedPost
},
markTeaserAsViewed: async (_parent, params, context, _resolveInfo) => {
const session = context.driver.session()
const writeTxResultPromise = session.writeTransaction(async (transaction) => {
const transactionResponse = await transaction.run(
`
MATCH (post:Post { id: $params.id })
MATCH (user:User { id: $userId })
MERGE (user)-[relation:VIEWED_TEASER { }]->(post)
ON CREATE
SET relation.createdAt = toString(datetime()),
post.viewedTeaserCount = post.viewedTeaserCount + 1
RETURN post
`,
{ userId: context.user.id, params },
)
return transactionResponse.records.map((record) => record.get('post').properties)
})
try {
const [post] = await writeTxResultPromise
post.viewedTeaserCount = post.viewedTeaserCount.low
return post
} finally {
session.close()
}
},
},
Post: {
...Resolver('Post', {
undefinedToNull: ['activityId', 'objectId', 'language', 'pinnedAt', 'pinned'],
hasMany: {
tags: '-[:TAGGED]->(related:Tag)',
categories: '-[:CATEGORIZED]->(related:Category)',
// categories: '-[:CATEGORIZED]->(related:Category)',
comments: '<-[:COMMENTS]-(related:Comment)',
shoutedBy: '<-[:SHOUTED]-(related:User)',
emotions: '<-[related:EMOTED]',
@ -346,6 +368,8 @@ export default {
boolean: {
shoutedByCurrentUser:
'MATCH(this)<-[:SHOUTED]-(related:User {id: $cypherParams.currentUserId}) RETURN COUNT(related) >= 1',
viewedTeaserByCurrentUser:
'MATCH (this)<-[:VIEWED_TEASER]-(u:User {id: $cypherParams.currentUserId}) RETURN COUNT(u) >= 1',
},
}),
relatedContributions: async (parent, params, context, resolveInfo) => {
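As a rough, assumed usage sketch (not taken from this diff): a client could exercise the new markTeaserAsViewed resolver and the viewedTeaserByCurrentUser field roughly like this, provided the GraphQL schema exposes them under these names and the mutation returns the updated Post.

const markTeaserAsViewedMutation = gql`
  mutation ($id: ID!) {
    markTeaserAsViewed(id: $id) {
      id
      viewedTeaserCount
    }
  }
`
// The boolean field is backed by the VIEWED_TEASER Cypher statement added above.
const teaserStateQuery = gql`
  query ($id: ID!) {
    Post(id: $id) {
      id
      viewedTeaserCount
      viewedTeaserByCurrentUser
    }
  }
`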
@ -16,7 +16,7 @@ const categoryIds = ['cat9', 'cat4', 'cat15']
let variables

const createPostMutation = gql`
mutation($id: ID, $title: String!, $content: String!, $language: String, $categoryIds: [ID]) {
mutation ($id: ID, $title: String!, $content: String!, $language: String, $categoryIds: [ID]) {
CreatePost(
id: $id
title: $title
@ -147,7 +147,7 @@ describe('Post', () => {
})
})

it('by categories', async () => {
/* it('by categories', async () => {
const postQueryFilteredByCategories = gql`
query Post($filter: _PostFilter) {
Post(filter: $filter) {
@ -172,7 +172,7 @@ describe('Post', () => {
await expect(
query({ query: postQueryFilteredByCategories, variables }),
).resolves.toMatchObject(expected)
})
}) */

describe('by emotions', () => {
const postQueryFilteredByEmotions = gql`
@ -317,33 +317,14 @@ describe('CreatePost', () => {
expected,
)
})

describe('language', () => {
beforeEach(() => {
variables = { ...variables, language: 'es' }
})

it('allows a user to set the language of the post', async () => {
const expected = { data: { CreatePost: { language: 'es' } } }
await expect(mutate({ mutation: createPostMutation, variables })).resolves.toMatchObject(
expected,
)
})
})
})
})

describe('UpdatePost', () => {
let author, newlyCreatedPost
const updatePostMutation = gql`
mutation($id: ID!, $title: String!, $content: String!, $categoryIds: [ID], $image: ImageInput) {
UpdatePost(
id: $id
title: $title
content: $content
categoryIds: $categoryIds
image: $image
) {
mutation ($id: ID!, $title: String!, $content: String!, $image: ImageInput) {
UpdatePost(id: $id, title: $title, content: $content, image: $image) {
id
title
content
@ -351,9 +332,6 @@ describe('UpdatePost', () => {
name
slug
}
categories {
id
}
createdAt
updatedAt
}
@ -441,7 +419,7 @@ describe('UpdatePost', () => {
expect(newlyCreatedPost.updatedAt).not.toEqual(UpdatePost.updatedAt)
})

describe('no new category ids provided for update', () => {
/* describe('no new category ids provided for update', () => {
it('resolves and keeps current categories', async () => {
const expected = {
data: {
@ -456,9 +434,9 @@ describe('UpdatePost', () => {
expected,
)
})
})
}) */

describe('given category ids', () => {
/* describe('given category ids', () => {
beforeEach(() => {
variables = { ...variables, categoryIds: ['cat27'] }
})
@ -477,7 +455,7 @@ describe('UpdatePost', () => {
expected,
)
})
})
}) */

describe('params.image', () => {
describe('is object', () => {
@ -519,7 +497,7 @@ describe('UpdatePost', () => {
describe('pin posts', () => {
let author
const pinPostMutation = gql`
mutation($id: ID!) {
mutation ($id: ID!) {
pinPost(id: $id) {
id
title
@ -795,7 +773,7 @@ describe('pin posts', () => {

it('pinned post appear first even when created before other posts', async () => {
const postOrderingQuery = gql`
query($orderBy: [_PostOrdering]) {
query ($orderBy: [_PostOrdering]) {
Post(orderBy: $orderBy) {
id
pinned
@ -838,7 +816,7 @@ describe('pin posts', () => {
describe('unpin posts', () => {
let pinnedPost
const unpinPostMutation = gql`
mutation($id: ID!) {
mutation ($id: ID!) {
unpinPost(id: $id) {
id
title
@ -950,7 +928,7 @@ describe('unpin posts', () => {
describe('DeletePost', () => {
let author
const deletePostMutation = gql`
mutation($id: ID!) {
mutation ($id: ID!) {
DeletePost(id: $id) {
id
deleted
@ -1074,14 +1052,14 @@ describe('DeletePost', () => {
describe('emotions', () => {
let author, postToEmote
const PostsEmotionsCountQuery = gql`
query($id: ID!) {
query ($id: ID!) {
Post(id: $id) {
emotionsCount
}
}
`
const PostsEmotionsQuery = gql`
query($id: ID!) {
query ($id: ID!) {
Post(id: $id) {
emotions {
emotion
@ -1115,7 +1093,7 @@ describe('emotions', () => {

describe('AddPostEmotions', () => {
const addPostEmotionsMutation = gql`
mutation($to: _PostInput!, $data: _EMOTEDInput!) {
mutation ($to: _PostInput!, $data: _EMOTEDInput!) {
AddPostEmotions(to: $to, data: $data) {
from {
id
@ -1232,7 +1210,7 @@ describe('emotions', () => {
describe('RemovePostEmotions', () => {
let removePostEmotionsVariables, postsEmotionsQueryVariables
const removePostEmotionsMutation = gql`
mutation($to: _PostInput!, $data: _EMOTEDInput!) {
mutation ($to: _PostInput!, $data: _EMOTEDInput!) {
RemovePostEmotions(to: $to, data: $data) {
from {
id
@ -1331,13 +1309,13 @@ describe('emotions', () => {
let PostsEmotionsByCurrentUserVariables

const PostsEmotionsCountByEmotionQuery = gql`
query($postId: ID!, $data: _EMOTEDInput!) {
query ($postId: ID!, $data: _EMOTEDInput!) {
PostsEmotionsCountByEmotion(postId: $postId, data: $data)
}
`

const PostsEmotionsByCurrentUserQuery = gql`
query($postId: ID!) {
query ($postId: ID!) {
PostsEmotionsByCurrentUser(postId: $postId)
}
`
@ -29,34 +29,22 @@ export default {
}
args.termsAndConditionsAgreedAt = new Date().toISOString()

let { nonce, email } = args
let { nonce, email, inviteCode } = args
email = normalizeEmail(email)
delete args.nonce
delete args.email
delete args.inviteCode
args = encryptPassword(args)

const { driver } = context
const session = driver.session()
const writeTxResultPromise = session.writeTransaction(async (transaction) => {
const createUserTransactionResponse = await transaction.run(
`
MATCH(email:EmailAddress {nonce: $nonce, email: $email})
WHERE NOT (email)-[:BELONGS_TO]->()
CREATE (user:User)
MERGE(user)-[:PRIMARY_EMAIL]->(email)
MERGE(user)<-[:BELONGS_TO]-(email)
SET user += $args
SET user.id = randomUUID()
SET user.role = 'user'
SET user.createdAt = toString(datetime())
SET user.updatedAt = toString(datetime())
SET user.allowEmbedIframes = FALSE
SET user.showShoutsPublicly = FALSE
SET email.verifiedAt = toString(datetime())
RETURN user {.*}
`,
{ args, nonce, email },
)
const createUserTransactionResponse = await transaction.run(signupCypher(inviteCode), {
args,
nonce,
email,
inviteCode,
})
const [user] = createUserTransactionResponse.records.map((record) => record.get('user'))
if (!user) throw new UserInputError('Invalid email or nonce')
return user
@ -74,3 +62,39 @@ export default {
},
},
}

const signupCypher = (inviteCode) => {
let optionalMatch = ''
let optionalMerge = ''
if (inviteCode) {
optionalMatch = `
OPTIONAL MATCH
(inviteCode:InviteCode {code: $inviteCode})<-[:GENERATED]-(host:User)
`
optionalMerge = `
MERGE(user)-[:REDEEMED { createdAt: toString(datetime()) }]->(inviteCode)
MERGE(host)-[:INVITED { createdAt: toString(datetime()) }]->(user)
MERGE(user)-[:FOLLOWS { createdAt: toString(datetime()) }]->(host)
MERGE(host)-[:FOLLOWS { createdAt: toString(datetime()) }]->(user)
`
}
const cypher = `
MATCH(email:EmailAddress {nonce: $nonce, email: $email})
WHERE NOT (email)-[:BELONGS_TO]->()
${optionalMatch}
CREATE (user:User)
MERGE(user)-[:PRIMARY_EMAIL]->(email)
MERGE(user)<-[:BELONGS_TO]-(email)
${optionalMerge}
SET user += $args
SET user.id = randomUUID()
SET user.role = 'user'
SET user.createdAt = toString(datetime())
SET user.updatedAt = toString(datetime())
SET user.allowEmbedIframes = FALSE
SET user.showShoutsPublicly = FALSE
SET email.verifiedAt = toString(datetime())
RETURN user {.*}
`
return cypher
}
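For illustration only, the branching in signupCypher can be seen directly by comparing the generated statements; this snippet is not part of the change and merely inspects the helper defined above.

// With an invite code the statement gains the OPTIONAL MATCH on :InviteCode and the
// REDEEMED / INVITED / FOLLOWS merges; without one those fragments stay empty strings.
const withInvite = signupCypher('ABCDEF') // 'ABCDEF' is a made-up code
const withoutInvite = signupCypher(undefined)
console.log(withInvite.includes(':REDEEMED')) // true
console.log(withoutInvite.includes(':REDEEMED')) // false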
@ -3,6 +3,7 @@ import { gql } from '../../helpers/jest'
import { getDriver, getNeode } from '../../db/neo4j'
import createServer from '../../server'
import { createTestClient } from 'apollo-server-testing'
import CONFIG from '../../config'

const neode = getNeode()

@ -15,7 +16,8 @@ beforeEach(async () => {
variables = {}
})

beforeAll(() => {
beforeAll(async () => {
await cleanDatabase()
const { server } = createServer({
context: () => {
return {
@ -34,8 +36,8 @@ afterEach(async () => {

describe('Signup', () => {
const mutation = gql`
mutation($email: String!) {
Signup(email: $email) {
mutation ($email: String!, $inviteCode: String) {
Signup(email: $email, inviteCode: $inviteCode) {
email
}
}
@ -50,6 +52,8 @@ describe('Signup', () => {
})

it('throws AuthorizationError', async () => {
CONFIG.INVITE_REGISTRATION = false
CONFIG.PUBLIC_REGISTRATION = false
await expect(mutate({ mutation, variables })).resolves.toMatchObject({
errors: [{ message: 'Not Authorised!' }],
})
@ -141,7 +145,7 @@ describe('Signup', () => {

describe('SignupVerification', () => {
const mutation = gql`
mutation(
mutation (
$name: String!
$password: String!
$email: String!
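An assumed example call of the updated Signup mutation with an invite code; the values are placeholders, not fixtures from the spec.

// Illustrative only: passes the new optional inviteCode variable alongside the email.
await mutate({
  mutation,
  variables: { email: 'new-user@example.org', inviteCode: 'ABCDEF' },
})
// With invite registration enabled this would be expected to resolve with the submitted email.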
@ -11,7 +11,7 @@ describe('file a report on a resource', () => {
let authenticatedUser, currentUser, mutate, query, moderator, abusiveUser, otherReportingUser
const categoryIds = ['cat9']
const fileReportMutation = gql`
mutation($resourceId: ID!, $reasonCategory: ReasonCategory!, $reasonDescription: String!) {
mutation ($resourceId: ID!, $reasonCategory: ReasonCategory!, $reasonDescription: String!) {
fileReport(
resourceId: $resourceId
reasonCategory: $reasonCategory
@ -42,7 +42,7 @@ describe('file a report on a resource', () => {
reasonDescription: 'Violates code of conduct !!!',
}
const reportsQuery = gql`
query($closed: Boolean) {
query ($closed: Boolean) {
reports(orderBy: createdAt_desc, closed: $closed) {
id
createdAt
@ -74,7 +74,7 @@ describe('file a report on a resource', () => {
}
`
const reviewMutation = gql`
mutation($resourceId: ID!, $disable: Boolean, $closed: Boolean) {
mutation ($resourceId: ID!, $disable: Boolean, $closed: Boolean) {
review(resourceId: $resourceId, disable: $disable, closed: $closed) {
createdAt
resource {

@ -16,6 +16,7 @@ describe('rewards', () => {
}

beforeAll(async () => {
await cleanDatabase()
const { server } = createServer({
context: () => {
return {
@ -75,7 +76,7 @@ describe('rewards', () => {

describe('reward', () => {
const rewardMutation = gql`
mutation($from: ID!, $to: ID!) {
mutation ($from: ID!, $to: ID!) {
reward(badgeKey: $from, userId: $to) {
id
badges {
@ -265,7 +266,7 @@ describe('rewards', () => {
}

const unrewardMutation = gql`
mutation($from: ID!, $to: ID!) {
mutation ($from: ID!, $to: ID!) {
unreward(badgeKey: $from, userId: $to) {
id
badges {
7 backend/src/schema/resolvers/roles.js Normal file
@ -0,0 +1,7 @@
export default {
  Query: {
    availableRoles: async (_parent, args, context, _resolveInfo) => {
      return ['admin', 'moderator', 'user']
    },
  },
}
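Assuming the new resolver is exposed as an availableRoles query in the GraphQL schema (the schema definition itself is not part of this excerpt), a client could fetch the list like this:

const availableRolesQuery = gql`
  query {
    availableRoles
  }
`
// Expected to resolve to ['admin', 'moderator', 'user'] for callers that pass the permission checks.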
@ -3,90 +3,202 @@ import { queryString } from './searches/queryString'

// see http://lucene.apache.org/core/8_3_1/queryparser/org/apache/lucene/queryparser/classic/package-summary.html#package.description

const cypherTemplate = (setup) => `
CALL db.index.fulltext.queryNodes('${setup.fulltextIndex}', $query)
YIELD node AS resource, score
${setup.match}
${setup.whereClause}
${setup.withClause}
RETURN
${setup.returnClause}
AS result
SKIP $skip
${setup.limit}
`

const simpleWhereClause =
'WHERE score >= 0.0 AND NOT (resource.deleted = true OR resource.disabled = true)'

const postWhereClause = `WHERE score >= 0.0
AND NOT (
author.deleted = true OR author.disabled = true
OR resource.deleted = true OR resource.disabled = true
OR (:User {id: $userId})-[:MUTED]->(author)
)`

const searchPostsSetup = {
fulltextIndex: 'post_fulltext_search',
match: 'MATCH (resource:Post)<-[:WROTE]-(author:User)',
whereClause: postWhereClause,
withClause: `WITH resource, author,
[(resource)<-[:COMMENTS]-(comment:Comment) | comment] AS comments,
[(resource)<-[:SHOUTED]-(user:User) | user] AS shouter`,
returnClause: `resource {
.*,
__typename: labels(resource)[0],
author: properties(author),
commentsCount: toString(size(comments)),
shoutedCount: toString(size(shouter)),
clickedCount: toString(resource.clickedCount),
viewedTeaserCount: toString(resource.viewedTeaserCount)
}`,
limit: 'LIMIT $limit',
}

const searchUsersSetup = {
fulltextIndex: 'user_fulltext_search',
match: 'MATCH (resource:User)',
whereClause: simpleWhereClause,
withClause: '',
returnClause: 'resource {.*, __typename: labels(resource)[0]}',
limit: 'LIMIT $limit',
}

const searchHashtagsSetup = {
fulltextIndex: 'tag_fulltext_search',
match: 'MATCH (resource:Tag)',
whereClause: simpleWhereClause,
withClause: '',
returnClause: 'resource {.*, __typename: labels(resource)[0]}',
limit: 'LIMIT $limit',
}

const countSetup = {
returnClause: 'toString(size(collect(resource)))',
limit: '',
}

const countUsersSetup = {
...searchUsersSetup,
...countSetup,
}
const countPostsSetup = {
...searchPostsSetup,
...countSetup,
}
const countHashtagsSetup = {
...searchHashtagsSetup,
...countSetup,
}

const searchResultPromise = async (session, setup, params) => {
return session.readTransaction(async (transaction) => {
return transaction.run(cypherTemplate(setup), params)
})
}

const searchResultCallback = (result) => {
return result.records.map((r) => r.get('result'))
}

const countResultCallback = (result) => {
return result.records[0].get('result')
}

const getSearchResults = async (context, setup, params, resultCallback = searchResultCallback) => {
const session = context.driver.session()
try {
const results = await searchResultPromise(session, setup, params)
log(results)
return resultCallback(results)
} finally {
session.close()
}
}

const multiSearchMap = [
{ symbol: '!', setup: searchPostsSetup, resultName: 'posts' },
{ symbol: '@', setup: searchUsersSetup, resultName: 'users' },
{ symbol: '#', setup: searchHashtagsSetup, resultName: 'hashtags' },
]
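To make the template above easier to read, this is roughly what cypherTemplate(searchUsersSetup) expands to once the setup values are substituted (empty clauses collapse to blank lines); it is written out here only as an illustration, not as an additional statement in the codebase.

// Derived by hand from searchUsersSetup and cypherTemplate above.
const expandedUserSearchCypher = `
  CALL db.index.fulltext.queryNodes('user_fulltext_search', $query)
  YIELD node AS resource, score
  MATCH (resource:User)
  WHERE score >= 0.0 AND NOT (resource.deleted = true OR resource.disabled = true)
  RETURN
  resource {.*, __typename: labels(resource)[0]}
  AS result
  SKIP $skip
  LIMIT $limit
`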
export default {
Query: {
findResources: async (_parent, args, context, _resolveInfo) => {
searchPosts: async (_parent, args, context, _resolveInfo) => {
const { query, postsOffset, firstPosts } = args
const { id: userId } = context.user

return {
postCount: getSearchResults(
context,
countPostsSetup,
{
query: queryString(query),
skip: 0,
userId,
},
countResultCallback,
),
posts: getSearchResults(context, searchPostsSetup, {
query: queryString(query),
skip: postsOffset,
limit: firstPosts,
userId,
}),
}
},
searchUsers: async (_parent, args, context, _resolveInfo) => {
const { query, usersOffset, firstUsers } = args
return {
userCount: getSearchResults(
context,
countUsersSetup,
{
query: queryString(query),
skip: 0,
},
countResultCallback,
),
users: getSearchResults(context, searchUsersSetup, {
query: queryString(query),
skip: usersOffset,
limit: firstUsers,
}),
}
},
searchHashtags: async (_parent, args, context, _resolveInfo) => {
const { query, hashtagsOffset, firstHashtags } = args
return {
hashtagCount: getSearchResults(
context,
countHashtagsSetup,
{
query: queryString(query),
skip: 0,
},
countResultCallback,
),
hashtags: getSearchResults(context, searchHashtagsSetup, {
query: queryString(query),
skip: hashtagsOffset,
limit: firstHashtags,
}),
}
},
searchResults: async (_parent, args, context, _resolveInfo) => {
const { query, limit } = args
const { id: thisUserId } = context.user
const { id: userId } = context.user

const postCypher = `
CALL db.index.fulltext.queryNodes('post_fulltext_search', $query)
YIELD node as resource, score
MATCH (resource)<-[:WROTE]-(author:User)
WHERE score >= 0.0
AND NOT (
author.deleted = true OR author.disabled = true
OR resource.deleted = true OR resource.disabled = true
OR (:User {id: $thisUserId})-[:MUTED]->(author)
)
WITH resource, author,
[(resource)<-[:COMMENTS]-(comment:Comment) | comment] as comments,
[(resource)<-[:SHOUTED]-(user:User) | user] as shouter
RETURN resource {
.*,
__typename: labels(resource)[0],
author: properties(author),
commentsCount: toString(size(comments)),
shoutedCount: toString(size(shouter))
const searchType = query.replace(/^([!@#]?).*$/, '$1')
const searchString = query.replace(/^([!@#])/, '')

const params = {
query: queryString(searchString),
skip: 0,
limit,
userId,
}
LIMIT $limit
`

const userCypher = `
CALL db.index.fulltext.queryNodes('user_fulltext_search', $query)
YIELD node as resource, score
MATCH (resource)
WHERE score >= 0.0
AND NOT (resource.deleted = true OR resource.disabled = true)
RETURN resource {.*, __typename: labels(resource)[0]}
LIMIT $limit
`
const tagCypher = `
CALL db.index.fulltext.queryNodes('tag_fulltext_search', $query)
YIELD node as resource, score
MATCH (resource)
WHERE score >= 0.0
AND NOT (resource.deleted = true OR resource.disabled = true)
RETURN resource {.*, __typename: labels(resource)[0]}
LIMIT $limit
`
if (searchType === '')
return [
...(await getSearchResults(context, searchPostsSetup, params)),
...(await getSearchResults(context, searchUsersSetup, params)),
...(await getSearchResults(context, searchHashtagsSetup, params)),
]

const myQuery = queryString(query)

const session = context.driver.session()
const searchResultPromise = session.readTransaction(async (transaction) => {
const postTransactionResponse = transaction.run(postCypher, {
query: myQuery,
limit,
thisUserId,
})
const userTransactionResponse = transaction.run(userCypher, {
query: myQuery,
limit,
thisUserId,
})
const tagTransactionResponse = transaction.run(tagCypher, {
query: myQuery,
limit,
})
return Promise.all([
postTransactionResponse,
userTransactionResponse,
tagTransactionResponse,
])
})

try {
const [postResults, userResults, tagResults] = await searchResultPromise
log(postResults)
log(userResults)
log(tagResults)
return [...postResults.records, ...userResults.records, ...tagResults.records].map((r) =>
r.get('resource'),
)
} finally {
session.close()
}
params.limit = 15
const type = multiSearchMap.find((obj) => obj.symbol === searchType)
return getSearchResults(context, type.setup, params)
},
},
}
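The two regular expressions in searchResults split the raw query into an optional type prefix and the remaining search string. A small self-contained illustration (sample values made up):

// '!' limits the search to posts, '@' to users, '#' to hashtags; no prefix searches all three.
const samples = ['panther', '@panther', '!beitrag', '#myHashtag']
for (const raw of samples) {
  const searchType = raw.replace(/^([!@#]?).*$/, '$1')
  const searchString = raw.replace(/^([!@#])/, '')
  console.log(searchType, searchString)
}
// logs: '' 'panther' / '@' 'panther' / '!' 'beitrag' / '#' 'myHashtag'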
@ -28,8 +28,8 @@ afterAll(async () => {
})

const searchQuery = gql`
query($query: String!) {
findResources(query: $query, limit: 5) {
query ($query: String!) {
searchResults(query: $query, limit: 5) {
__typename
... on Post {
id
@ -47,6 +47,21 @@ const searchQuery = gql`
}
}
`

const searchPostQuery = gql`
query ($query: String!, $firstPosts: Int, $postsOffset: Int) {
searchPosts(query: $query, firstPosts: $firstPosts, postsOffset: $postsOffset) {
postCount
posts {
__typename
id
title
content
}
}
}
`
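A hedged usage sketch of the paginated searchPosts query defined above; the variable values mirror the test further down and are otherwise arbitrary.

await query({
  query: searchPostQuery,
  variables: { query: 'beitrag', firstPosts: 1, postsOffset: 0 },
})
// Expected result shape: { data: { searchPosts: { postCount, posts: [...] } } }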
describe('resolvers/searches', () => {
let variables

@ -65,7 +80,7 @@ describe('resolvers/searches', () => {
variables = { query: 'John' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
id: 'a-user',
name: 'John Doe',
@ -95,7 +110,7 @@ describe('resolvers/searches', () => {
variables = { query: 'beitrag' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'a-post',
@ -114,7 +129,7 @@ describe('resolvers/searches', () => {
variables = { query: 'BEITRAG' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'a-post',
@ -132,7 +147,7 @@ describe('resolvers/searches', () => {
it('returns empty search results', async () => {
await expect(
query({ query: searchQuery, variables: { query: 'Unfug' } }),
).resolves.toMatchObject({ data: { findResources: [] } })
).resolves.toMatchObject({ data: { searchResults: [] } })
})
})

@ -189,7 +204,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: 'beitrag' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: expect.arrayContaining([
searchResults: expect.arrayContaining([
{
__typename: 'Post',
id: 'a-post',
@ -216,7 +231,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: 'tee-ei' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'g-post',
@ -235,7 +250,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: '„teeei“' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'g-post',
@ -256,7 +271,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: '(a - b)²' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'c-post',
@ -277,7 +292,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: '(a-b)²' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'c-post',
@ -298,7 +313,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: '+ b² 2.' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'c-post',
@ -321,7 +336,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: 'der panther' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Post',
id: 'd-post',
@ -349,7 +364,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: 'Vorü Subs' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: expect.arrayContaining([
searchResults: expect.arrayContaining([
{
__typename: 'Post',
id: 'd-post',
@ -395,7 +410,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: '-maria-' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: expect.arrayContaining([
searchResults: expect.arrayContaining([
{
__typename: 'User',
id: 'c-user',
@ -416,6 +431,128 @@ und hinter tausend Stäben keine Welt.`,
})
})

describe('adding a user and a hashtag with a name that is content of a post', () => {
beforeAll(async () => {
await Promise.all([
Factory.build('user', {
id: 'f-user',
name: 'Peter Panther',
slug: 'peter-panther',
}),
await Factory.build('tag', { id: 'Panther' }),
])
})

describe('query the word that contains the post, the hashtag and the name of the user', () => {
it('finds the user, the post and the hashtag', async () => {
variables = { query: 'panther' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
searchResults: expect.arrayContaining([
{
__typename: 'User',
id: 'f-user',
name: 'Peter Panther',
slug: 'peter-panther',
},
{
__typename: 'Post',
id: 'd-post',
title: 'Der Panther',
content: `Sein Blick ist vom Vorübergehn der Stäbe<br>
so müd geworden, daß er nichts mehr hält.<br>
Ihm ist, als ob es tausend Stäbe gäbe<br>
und hinter tausend Stäben keine Welt.`,
},
{
__typename: 'Tag',
id: 'Panther',
},
]),
},
errors: undefined,
})
})
})

describe('@query the word that contains the post, the hashtag and the name of the user', () => {
it('only finds the user', async () => {
variables = { query: '@panther' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
searchResults: expect.not.arrayContaining([
{
__typename: 'Post',
id: 'd-post',
title: 'Der Panther',
content: `Sein Blick ist vom Vorübergehn der Stäbe<br>
so müd geworden, daß er nichts mehr hält.<br>
Ihm ist, als ob es tausend Stäbe gäbe<br>
und hinter tausend Stäben keine Welt.`,
},
{
__typename: 'Tag',
id: 'Panther',
},
]),
},
errors: undefined,
})
})
})

describe('!query the word that contains the post, the hashtag and the name of the user', () => {
it('only finds the post', async () => {
variables = { query: '!panther' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
searchResults: expect.not.arrayContaining([
{
__typename: 'User',
id: 'f-user',
name: 'Peter Panther',
slug: 'peter-panther',
},
{
__typename: 'Tag',
id: 'Panther',
},
]),
},
errors: undefined,
})
})
})

describe('#query the word that contains the post, the hashtag and the name of the user', () => {
it('only finds the hashtag', async () => {
variables = { query: '#panther' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
searchResults: expect.not.arrayContaining([
{
__typename: 'User',
id: 'f-user',
name: 'Peter Panther',
slug: 'peter-panther',
},
{
__typename: 'Post',
id: 'd-post',
title: 'Der Panther',
content: `Sein Blick ist vom Vorübergehn der Stäbe<br>
so müd geworden, daß er nichts mehr hält.<br>
Ihm ist, als ob es tausend Stäbe gäbe<br>
und hinter tausend Stäben keine Welt.`,
},
]),
},
errors: undefined,
})
})
})
})

describe('adding a post, written by a user who is muted by the authenticated user', () => {
beforeAll(async () => {
const mutedUser = await Factory.build('user', {
@ -440,7 +577,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: 'beitrag' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: expect.not.arrayContaining([
searchResults: expect.not.arrayContaining([
{
__typename: 'Post',
id: 'muted-post',
@ -465,7 +602,7 @@ und hinter tausend Stäben keine Welt.`,
variables = { query: 'myha' }
await expect(query({ query: searchQuery, variables })).resolves.toMatchObject({
data: {
findResources: [
searchResults: [
{
__typename: 'Tag',
id: 'myHashtag',
@ -477,6 +614,30 @@ und hinter tausend Stäben keine Welt.`,
})
})
})

describe('searchPostQuery', () => {
describe('query with limit 1', () => {
it('has a count greater than 1', async () => {
variables = { query: 'beitrag', firstPosts: 1, postsOffset: 0 }
await expect(query({ query: searchPostQuery, variables })).resolves.toMatchObject({
data: {
searchPosts: {
postCount: 2,
posts: [
{
__typename: 'Post',
id: 'a-post',
title: 'Beitrag',
content: 'Ein erster Beitrag',
},
],
},
},
errors: undefined,
})
})
})
})
})
})
})
})
@ -39,7 +39,11 @@ const matchBeginningOfWords = (str) => {
}

export function normalizeWhitespace(str) {
return str.replace(/\s+/g, ' ').trim()
// delete the first character if it is !, @ or #
return str
.replace(/^([!@#])/, '')
.replace(/\s+/g, ' ')
.trim()
}

export function escapeSpecialCharacters(str) {
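The effect of the changed normalizeWhitespace is easiest to see with a few sample inputs; this snippet only illustrates the function above (assuming it is imported from './searches/queryString').

import { normalizeWhitespace } from './searches/queryString'

// A leading search prefix is stripped before whitespace is collapsed and trimmed.
normalizeWhitespace('@john   doe  ') // => 'john doe'
normalizeWhitespace('#myHashtag') // => 'myHashtag'
normalizeWhitespace('  plain   text') // => 'plain text'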
@ -9,17 +9,17 @@ const instance = getNeode()
const driver = getDriver()

const mutationShoutPost = gql`
mutation($id: ID!) {
mutation ($id: ID!) {
shout(id: $id, type: Post)
}
`
const mutationUnshoutPost = gql`
mutation($id: ID!) {
mutation ($id: ID!) {
unshout(id: $id, type: Post)
}
`
const queryPost = gql`
query($id: ID!) {
query ($id: ID!) {
Post(id: $id) {
id
shoutedBy {

@ -70,7 +70,7 @@ describe('SocialMedia', () => {

beforeEach(() => {
mutation = gql`
mutation($url: String!) {
mutation ($url: String!) {
CreateSocialMedia(url: $url) {
id
url
@ -131,7 +131,7 @@ describe('SocialMedia', () => {
describe('ownedBy', () => {
beforeEach(() => {
mutation = gql`
mutation($url: String!) {
mutation ($url: String!) {
CreateSocialMedia(url: $url) {
url
ownedBy {
@ -162,7 +162,7 @@ describe('SocialMedia', () => {
const socialMedia = await setUpSocialMedia()

mutation = gql`
mutation($id: ID!, $url: String!) {
mutation ($id: ID!, $url: String!) {
UpdateSocialMedia(id: $id, url: $url) {
id
url
@ -225,7 +225,7 @@ describe('SocialMedia', () => {
const socialMedia = await setUpSocialMedia()

mutation = gql`
mutation($id: ID!) {
mutation ($id: ID!) {
DeleteSocialMedia(id: $id) {
id
url

@ -21,7 +21,7 @@ const statisticsQuery = gql`
}
}
`
beforeAll(() => {
beforeAll(async () => {
authenticatedUser = undefined
const { server } = createServer({
context: () => {
@ -33,6 +33,7 @@ beforeAll(() => {
},
})
query = createTestClient(server).query
await cleanDatabase()
})

afterEach(async () => {