Compare commits

...

271 Commits

Author SHA1 Message Date
Andras Bacsai
ef91441c76 Merge pull request #603 from coollabsio/next
v3.10.0
2022-09-08 15:41:46 +02:00
Andras Bacsai
aa6c56b63d do not show servers to non admins 2022-09-08 15:23:01 +02:00
Andras Bacsai
18e899d15e disable remote servers for now 2022-09-08 15:00:54 +02:00
Andras Bacsai
63fa8924ae fix: autoupdater 2022-09-08 14:46:08 +02:00
Andras Bacsai
0e13e3bd81 feat: new servers view 2022-09-08 14:42:04 +02:00
Andras Bacsai
372c0ed457 fix: glitchtip env to pyhton boolean 2022-09-08 12:04:08 +02:00
Andras Bacsai
071077200b ui: fix tooltip 2022-09-08 11:57:15 +02:00
Andras Bacsai
65579a2861 Merge branch 'next' of github.com:coollabsio/coolify into next 2022-09-08 11:47:09 +02:00
Andras Bacsai
bb7603ae2a Merge pull request #601 from c0ldfront/fixes
fix typo located in postCreateCommand in devcontainers.json
2022-09-08 11:45:54 +02:00
Andras Bacsai
cce67d274e fix: ui variables 2022-09-08 11:44:23 +02:00
Andras Bacsai
794329dcad fix: port checkers 2022-09-08 11:41:38 +02:00
Andras Bacsai
e36fda3ff1 fix: ispublic status on databases 2022-09-08 10:54:24 +02:00
Andras Bacsai
3832d33259 fix: save search input 2022-09-08 10:45:12 +02:00
Andras Bacsai
7350524456 ui: dashboard updates 2022-09-08 10:30:13 +02:00
Andras Bacsai
a1a973a873 fix: change to execa from utils 2022-09-08 09:32:50 +02:00
David Mydlarz
f2a915700c fix type located in postCreateCommand in devcontainers.json 2022-09-08 13:33:18 +09:00
Andras Bacsai
e184f99617 Merge branch 'main' into next 2022-09-07 20:32:27 +00:00
Andras Bacsai
ab07adb14f fix: Settings for service 2022-09-07 20:32:15 +00:00
Andras Bacsai
6535c68276 Merge branch 'main' into next 2022-09-07 20:15:26 +00:00
Andras Bacsai
dde2772e52 fix: dnsServer formatting 2022-09-07 20:14:49 +00:00
Andras Bacsai
ac72c19d22 Merge branch 'main' into next 2022-09-07 15:40:37 +00:00
Andras Bacsai
67fc2fd3c0 fix: pr previews 2022-09-07 15:29:56 +00:00
Andras Bacsai
4acc59204c ui: dashboard updates and a lot more 2022-09-07 15:59:37 +02:00
Andras Bacsai
07cadb59e0 fix: edgedb 2022-09-07 11:40:58 +02:00
Andras Bacsai
6fa4741c81 feat: database secrets 2022-09-07 11:40:52 +02:00
Andras Bacsai
f4bac2382c fix: edgedb stuff 2022-09-07 11:02:13 +02:00
Andras Bacsai
67b72220c0 feat: add traefik acme json to coolify container 2022-09-07 11:01:04 +02:00
Andras Bacsai
feeb14ea47 fix: edgedb ui 2022-09-07 10:49:49 +02:00
Andras Bacsai
bdfb6f5f46 chore: version++ 2022-09-07 10:23:08 +02:00
Andras Bacsai
53491e9eaa Merge pull request #522 from ikezedev/edge-db
[New Database]: EdgeDB
2022-09-07 10:21:47 +02:00
Andras Bacsai
f720de65fa Update _DatabaseLinks.svelte 2022-09-07 10:21:12 +02:00
Andras Bacsai
3d70162a8d Merge branch 'next' into edge-db 2022-09-07 10:20:40 +02:00
Andras Bacsai
d3a1bbc3d0 Merge pull request #596 from coollabsio/next
v3.9.2
2022-09-07 10:16:54 +02:00
Andras Bacsai
0078574ee6 fix: add php 8.1/8.2 2022-09-07 10:02:59 +02:00
Andras Bacsai
7cfd313531 ui: fix loading start/stop db/services 2022-09-07 09:45:16 +02:00
Andras Bacsai
e7919e9a1b ui: fix initial loading icon bg 2022-09-07 09:39:29 +02:00
Andras Bacsai
98073202e9 fix: minio default env variables 2022-09-07 09:28:53 +02:00
Andras Bacsai
8dee345f85 enable autoupdate option for everyone 2022-09-07 09:23:03 +02:00
Andras Bacsai
9161882f33 debug: add debug log 2022-09-07 09:21:28 +02:00
Andras Bacsai
eef313665b fix: service volume generation 2022-09-07 09:18:44 +02:00
Andras Bacsai
53e70fbfcb dev: update devcontainer 2022-09-07 09:15:10 +02:00
Andras Bacsai
05a1721499 fix: revert last change with domain check 2022-09-07 09:15:03 +02:00
Andras Bacsai
2f772080b8 fix: add initial DNS servers 2022-09-07 09:06:30 +02:00
Andras Bacsai
a5548c080c fix: service state update 2022-09-07 08:58:51 +02:00
Andras Bacsai
7e0a1ecc80 ui: fix login/register page 2022-09-07 08:58:40 +02:00
Andras Bacsai
3f2dcccc07 fix: use ip instead of window location host 2022-09-07 08:58:27 +02:00
Andras Bacsai
adc5965b32 fix: use ip address instead of window location 2022-09-07 08:57:32 +02:00
Andras Bacsai
6088f2e573 chore: version++ 2022-09-06 15:46:22 +02:00
Andras Bacsai
fc705746c0 fix: gitlab webhook 2022-09-06 15:45:29 +02:00
Andras Bacsai
8182359fe4 Merge branch 'main' into next 2022-09-06 15:41:22 +02:00
Andras Bacsai
e7ae15162c Update GH Actions prod release 2022-09-06 15:40:16 +02:00
Andras Bacsai
12ca20432d v3.9.1 (#594) 2022-09-06 15:33:32 +02:00
Andras Bacsai
8b7406e168 revert 2022-09-06 15:25:18 +02:00
Andras Bacsai
9d6317f782 fix 2022-09-06 15:13:51 +02:00
Andras Bacsai
d8bdb73140 test tagging 2022-09-06 14:59:24 +02:00
Andras Bacsai
476db15431 test 2022-09-06 14:49:48 +02:00
Andras Bacsai
20ce356296 fix: move restart button to settings 2022-09-06 14:47:37 +02:00
Andras Bacsai
ea594dcbc6 fix: workdir 2022-09-06 14:32:50 +02:00
Andras Bacsai
021b9746a8 update production release 2022-09-06 14:29:54 +02:00
Andras Bacsai
c4615ae557 Dockerfile 2022-09-06 14:20:14 +02:00
Andras Bacsai
95a5089bdc fix: debug api logging + gh actions 2022-09-06 14:08:10 +02:00
Andras Bacsai
cef1fba281 finally?! 2022-09-06 13:23:53 +02:00
Andras Bacsai
5c7859a258 asd 2022-09-06 13:08:29 +02:00
Andras Bacsai
986cdae5b0 asd 2022-09-06 13:01:55 +02:00
Andras Bacsai
3b11e28d6c asd 2022-09-06 12:56:02 +02:00
Andras Bacsai
eba63e8e76 fix 2022-09-06 12:54:38 +02:00
Andras Bacsai
2fc65e3b42 grr 2022-09-06 12:04:58 +02:00
Andras Bacsai
7d504ab2bf fixxxx 2022-09-06 12:02:03 +02:00
Andras Bacsai
216c7efd42 fixxxxx 2022-09-06 11:55:16 +02:00
Andras Bacsai
8c4149db16 fix issues 2022-09-06 11:42:00 +02:00
Andras Bacsai
20ac8f69ea Update dockerfile 2022-09-06 11:38:36 +02:00
Andras Bacsai
1db3d7a6fb fixit 2022-09-06 11:31:54 +02:00
Andras Bacsai
b1c1138cf8 fixit now 2022-09-06 11:31:02 +02:00
Andras Bacsai
00b1a4f174 fix flow 2022-09-06 11:21:58 +02:00
Andras Bacsai
86cc665b58 fix 2022-09-06 11:05:40 +02:00
Andras Bacsai
e26dd578ef fixes 2022-09-06 11:02:33 +02:00
Andras Bacsai
f1f3217052 testing new gh actions 2022-09-06 10:59:50 +02:00
Andras Bacsai
8f14fd89ef show not allowed list 2022-09-06 10:43:59 +02:00
Andras Bacsai
26e4d52a61 github actions update 2022-09-06 10:30:37 +02:00
Andras Bacsai
319c647147 updates 2022-09-06 10:28:13 +02:00
Andras Bacsai
f4cd93bd36 test allowedlist 2022-09-06 10:14:39 +02:00
Andras Bacsai
5a80bb1d2a fix: dockerfile 2022-09-06 10:14:34 +02:00
Andras Bacsai
1126dcacf5 update 2022-09-06 10:08:31 +02:00
Andras Bacsai
bdf9a73d19 fix 2022-09-06 09:42:13 +02:00
Andras Bacsai
1f73b83a79 testing traefik stuff 2022-09-06 09:30:13 +02:00
Andras Bacsai
73bd62c51e Merge pull request #582 from coollabsio/next
v3.9.0
2022-09-06 09:09:42 +02:00
Andras Bacsai
9acd5c94e8 Merge pull request #592 from kaname-png/ui-reworks
feat(routes): rework ui from login and register page
2022-09-06 08:42:26 +02:00
Andras Bacsai
6e85eac14b update package 2022-09-06 08:39:17 +02:00
Andras Bacsai
936baf676e cleanup logs 2022-09-06 08:01:04 +02:00
Andras Bacsai
867f06d813 chore: version++ 2022-09-06 07:57:39 +02:00
Kaname
7f9f440789 feat(routes): rework ui from login and register page 2022-09-05 19:21:19 +00:00
Andras Bacsai
5a15e64471 fix: update prisma
feat(beta): database branching
2022-09-05 15:41:32 +02:00
Andras Bacsai
c9aecd51f3 remove console logs 2022-09-05 13:15:58 +02:00
Andras Bacsai
6ca1d978d4 fix restart 2022-09-05 13:09:59 +02:00
Andras Bacsai
a18c73bd7c fix 2022-09-05 12:59:06 +02:00
Andras Bacsai
26d86cbcb5 debug more 2022-09-05 12:09:13 +02:00
Andras Bacsai
b24a5d9aca updates 2022-09-05 11:52:03 +02:00
Andras Bacsai
5305bc1ceb debug on 2022-09-05 10:17:52 +02:00
Andras Bacsai
a672f0f56c Merge pull request #590 from AshKyd/main
Small typo in global settings
2022-09-05 10:11:58 +02:00
Andras Bacsai
9ab5e13e8f fix: remote verification 2022-09-05 10:01:49 +02:00
Andras Bacsai
a49171f8cc ui: login page 2022-09-05 09:29:58 +02:00
Andras Bacsai
65d8dc412a fix: expose port is not required 2022-09-05 09:19:25 +02:00
Andras Bacsai
8600400632 fix: service deploymentEnabled 2022-09-05 09:15:32 +02:00
Andras Bacsai
11d10bee12 migrate: database_branches 2022-09-05 09:09:49 +02:00
Andras Bacsai
dbd16e8285 fix: fqdn or expose port required 2022-09-05 09:09:32 +02:00
Andras Bacsai
eb26787079 fix 2022-09-05 08:37:54 +02:00
Andras Bacsai
b0a7b1eb3d ui fix 2022-09-05 08:21:18 +02:00
Andras Bacsai
f994092d7f fix: repository link trim 2022-09-05 08:16:53 +02:00
Andras Bacsai
946d8e5be5 chore: version++ 2022-09-05 08:14:50 +02:00
Andras Bacsai
6d7c2ae74a fix: ssh pid agent name 2022-09-05 08:14:43 +02:00
Ash Kyd
1ba71b0b1b Small typo in global settings 2022-09-05 13:00:14 +10:00
Andras Bacsai
47c3af6a0e oops 2022-09-02 19:03:04 +00:00
Andras Bacsai
e5527a5aa5 temp 2022-09-02 18:46:24 +00:00
Andras Bacsai
9bb0dcd73f Testing different Dockerfile 2022-09-02 18:44:29 +00:00
Andras Bacsai
4159804052 update github actions 2022-09-02 17:56:33 +00:00
Andras Bacsai
adb27cf143 rc test 2022-09-02 17:52:45 +00:00
Andras Bacsai
a4879d854f github-actions: added rc release 2022-09-02 17:36:47 +00:00
Andras Bacsai
8b92dfb889 test: dockerfile 2022-09-02 15:36:24 +02:00
Andras Bacsai
eb4868cb6e test: native binary target 2022-09-02 15:35:24 +02:00
Andras Bacsai
e06e6e05ae disable taiga for now 2022-09-02 15:31:22 +02:00
Andras Bacsai
4fce4f81c7 fix: taiga 2022-09-02 15:11:21 +02:00
Andras Bacsai
ae11283574 fix 2022-09-02 14:33:52 +02:00
Andras Bacsai
15fc9aa483 fix nginxconf 2022-09-02 14:24:26 +02:00
Andras Bacsai
2ebfb8e6a9 fix 2022-09-02 14:14:20 +02:00
Andras Bacsai
d098ea675f feat: Taiga 2022-09-02 14:11:36 +02:00
Andras Bacsai
8ad152e5fc update autoupdate process 2022-09-02 13:12:45 +02:00
Andras Bacsai
14077fcf51 fix: database name on logs view 2022-09-02 09:05:36 +02:00
Andras Bacsai
b427573e19 fix: explainer component 2022-09-02 09:00:35 +02:00
Andras Bacsai
46268f0dcf fix: Settings missing id 2022-09-02 08:43:02 +02:00
Andras Bacsai
006c178eb1 fix: rename components + remove PR/MR deployment from public repos 2022-09-01 15:32:19 +02:00
Andras Bacsai
d63b20dabb revert git submodule 2022-09-01 15:02:49 +02:00
Andras Bacsai
fcf0a391ed fix: finally works! :) 2022-09-01 14:51:47 +02:00
Andras Bacsai
263b9c4b3e fix: traefik 2022-09-01 14:41:29 +02:00
Andras Bacsai
1dc7355952 fix: traefik appwrite 2022-09-01 14:36:07 +02:00
Andras Bacsai
67e4a72a28 fix: appwrite letsencrypt 2022-09-01 14:23:52 +02:00
Andras Bacsai
e6ea07f9b7 fix: exposedport on save 2022-09-01 14:01:31 +02:00
Andras Bacsai
44a691ae29 fix: UI + refactor 2022-09-01 13:58:44 +02:00
Andras Bacsai
290dbc43cb fix: gitlab webhooks 2022-09-01 13:58:27 +02:00
Andras Bacsai
219f1f9f3f feat: gitlab dual branch 2022-09-01 12:17:20 +02:00
Andras Bacsai
582170f26e feat: github allow fual branches 2022-09-01 12:14:06 +02:00
Andras Bacsai
4e2dad7720 feat: show elapsed time on running builds 2022-09-01 12:13:54 +02:00
Andras Bacsai
d002ec72ad update lock file 2022-09-01 11:20:36 +02:00
Andras Bacsai
f6bb14f7c4 ui: change tooltips and info boxes 2022-09-01 11:20:22 +02:00
Andras Bacsai
e1697848a5 fix: submodule 2022-09-01 09:30:24 +02:00
Andras Bacsai
4d48bba350 fix: ui 2022-08-31 15:40:07 +02:00
Andras Bacsai
a690cc5564 ui: fixes 2022-08-31 15:17:18 +02:00
Andras Bacsai
be16f76034 Merge pull request #583 from coollabsio/ui
UI updates
2022-08-31 15:07:01 +02:00
Andras Bacsai
ae4cf44728 Merge branch 'next' into ui 2022-08-31 15:06:53 +02:00
Andras Bacsai
4ac0df71b1 Merge pull request #581 from ArticaDev/main
More responsiveness improvements to UI
2022-08-31 15:06:23 +02:00
Andras Bacsai
dbd948867c fix: loading state on start 2022-08-31 15:03:36 +02:00
Andras Bacsai
a9b5cd6c31 fix: glitchtip things 2022-08-31 15:03:27 +02:00
Andras Bacsai
92f513d514 feat: restart application 2022-08-31 15:03:04 +02:00
Andras Bacsai
b239d21961 chore: version++ 2022-08-31 15:02:36 +02:00
Andras Bacsai
40e8dd4a8d feat: new service - weblate 2022-08-31 15:02:05 +02:00
Andras Bacsai
a667435ef2 fix contribution guide 2022-08-31 13:22:54 +02:00
Andras Bacsai
042b4e7587 typo 2022-08-31 11:39:20 +02:00
Andras Bacsai
c46a1b4a59 i18n converter 2022-08-31 11:36:34 +02:00
LL
e6035d5479 improving localdocker responsiveness 2022-08-31 01:05:44 -03:00
LL
008d090093 improving identity and access responsiveness 2022-08-31 01:03:27 -03:00
LL
6ff080c36b improving github page responsiveness 2022-08-31 01:02:33 -03:00
Andras Bacsai
086ca30323 fix: oh god Prisma 2022-08-30 15:15:57 +00:00
Andras Bacsai
17f3ecbbcb Merge pull request #579 from coollabsio/next
v3.8.8
2022-08-30 17:00:22 +02:00
Andras Bacsai
1f2c8c4ad2 chore: version++ 2022-08-30 14:59:16 +00:00
Andras Bacsai
bc03331c66 revert prisma 2022-08-30 14:58:52 +00:00
Andras Bacsai
ffd1711d4f Update package.json 2022-08-30 16:19:31 +02:00
Andras Bacsai
1cdbda1b6b Update common.ts 2022-08-30 16:13:07 +02:00
Andras Bacsai
fd15e5182d Update handlers.ts 2022-08-30 16:12:01 +02:00
Andras Bacsai
b4cc0fb0f3 Merge pull request #578 from coollabsio/next
v3.8.6
2022-08-30 15:22:20 +02:00
Andras Bacsai
fb2d03f2e1 fix: gitlab apps 2022-08-30 15:07:41 +02:00
Andras Bacsai
ab261aba49 Create CODE_OF_CONDUCT.md 2022-08-30 14:43:54 +02:00
Andras Bacsai
615bc8c5b9 Update docs link 2022-08-30 14:38:06 +02:00
Andras Bacsai
3fc98c8c1b fix: include 2022-08-30 09:20:28 +02:00
Andras Bacsai
a5cc14e885 fix: include 2022-08-30 09:19:34 +02:00
Andras Bacsai
fe4c0a4f28 Merge pull request #576 from luhagel/fix/db-destinations-redirect
Fix: Route to the correct path when creating destination from db config
2022-08-30 09:19:43 +02:00
Luca Hagel
a21678b5b8 Fix: Route to the correct path when creating destination from db config 2022-08-30 06:48:26 +02:00
Andras Bacsai
2272ea1139 chore: version++ 2022-08-29 15:49:37 +02:00
Andras Bacsai
e9affabf39 ui: fixes 2022-08-29 15:48:59 +02:00
Andras Bacsai
5a412000e1 Update dockerfile 2022-08-29 15:42:18 +02:00
Andras Bacsai
30312de4dd fix: compareVersions 2022-08-29 15:38:19 +02:00
Andras Bacsai
d9a775de16 Merge branch 'main' into next 2022-08-29 15:33:27 +02:00
Andras Bacsai
3d7cd78d0e Merge pull request #570 from ArticaDev/main
UI Improvements and minor responsiveness fixes 👍
2022-08-29 15:33:20 +02:00
Andras Bacsai
ccd550bbc4 Contribution guide + code refactor + package updates 2022-08-29 15:29:00 +02:00
Andras Bacsai
a6ffb5c61c fix: pr deployment 2022-08-27 20:42:04 +00:00
Lucas Lima
cb7659bb86 fixing sizes in register screen on mobile too :D 2022-08-27 12:58:03 +00:00
Lucas Lima
391f21f57e Merge remote-tracking branch 'upstream/main' 2022-08-27 12:56:21 +00:00
Andras Bacsai
fdce70937f Merge pull request #572 from coollabsio/next
v3.8.5
2022-08-27 13:46:10 +02:00
Andras Bacsai
2060619e5b fix: again 2022-08-27 11:42:51 +00:00
Andras Bacsai
d7fd1fc65b fix: next/nuxt deployment type 2022-08-27 11:39:04 +00:00
Andras Bacsai
6760d7e776 fix: whitelabeled icon 2022-08-27 10:33:03 +00:00
Andras Bacsai
d61187c836 fix: white labeled icon on navbar 2022-08-27 10:30:07 +00:00
Andras Bacsai
f9932f9fee fix: process 2022-08-27 10:22:58 +00:00
Andras Bacsai
c003e56c03 fix: typo 2022-08-27 10:22:36 +00:00
Andras Bacsai
4d94106bff fix: copy all files during install process 2022-08-27 10:22:03 +00:00
Andras Bacsai
3bd2d4f868 Merge branch 'main' into next 2022-08-27 08:29:32 +00:00
Andras Bacsai
75b37ea0dc chore: version++ 2022-08-27 08:24:38 +00:00
Andras Bacsai
c356d0455d Update staging-release.yml 2022-08-27 10:19:50 +02:00
Andras Bacsai
278f75e70c Update production-release.yml 2022-08-27 10:19:32 +02:00
Andras Bacsai
2c7c5a3dc3 Merge pull request #571 from coollabsio/next
v3.8.4
2022-08-27 09:57:16 +02:00
Andras Bacsai
806ffacedd remove unnecessary lines 2022-08-27 07:56:38 +00:00
Andras Bacsai
93246f80c4 fix: pr deployments + remove public gits 2022-08-27 07:46:54 +00:00
Andras Bacsai
e9723d3f22 fix: cleanup build cache as well 2022-08-27 07:46:30 +00:00
Andras Bacsai
6baec7277f fix: decrypt secrets 2022-08-27 07:46:20 +00:00
Andras Bacsai
d43554d290 fix: queue cleanup 2022-08-27 07:46:06 +00:00
Lucas Lima
1922e5e014 minor fixes to desktop grid flow 2022-08-27 02:10:35 +00:00
Lucas Lima
6c065a64fa fixing responsiveness issues with cleanup and restart buttons 2022-08-27 01:52:45 +00:00
Lucas Lima
3b6e5853d8 fixing login page unaligned with center on mobile 2022-08-27 01:52:24 +00:00
Andras Bacsai
463fee429b ui: fixes 2022-08-26 18:56:05 +00:00
Andras Bacsai
570b286ef9 fix: team switching 2022-08-26 18:44:51 +00:00
Andras Bacsai
4c5e71f33c fix: delete team while it is active 2022-08-26 18:39:13 +00:00
Andras Bacsai
3884483bca ui: dashbord fixes 2022-08-26 18:25:20 +00:00
Andras Bacsai
fc1a89cc63 fix: UI thinkgs 2022-08-26 17:11:54 +00:00
Andras Bacsai
c471eed808 Merge pull request #568 from coollabsio/next
v3.8.3
2022-08-26 18:01:55 +02:00
Andras Bacsai
35450dfc8f fix: secrets decryption 2022-08-26 16:00:36 +00:00
Andras Bacsai
5360c60f3d Merge pull request #565 from coollabsio/next
v3.8.2
2022-08-26 15:22:27 +02:00
Andras Bacsai
f60c640dc6 ui: fix 2022-08-26 13:28:54 +02:00
Andras Bacsai
91bb323e84 ui fixes 2022-08-26 13:22:43 +02:00
Andras Bacsai
cf9e122bd2 ui: fixes 2022-08-26 13:17:35 +02:00
Andras Bacsai
7528ca18d8 ui: fixes
feat: restart coolify button
2022-08-26 12:46:20 +02:00
Andras Bacsai
05ee35b6bc typo 2022-08-26 12:07:21 +02:00
Andras Bacsai
0e8b069781 ui: fine-tune 2022-08-26 10:29:45 +02:00
Andras Bacsai
3b95d7278d ui: dashboard fine-tunes 2022-08-26 10:22:50 +02:00
Andras Bacsai
7691706295 fix lockfile 2022-08-26 10:00:48 +02:00
Andras Bacsai
2af65ee946 Merge pull request #563 from kaname-png/rework-home
feat(ui): rework home UI and with responsive design
2022-08-26 10:00:06 +02:00
Andras Bacsai
279e1fd9c5 Merge branch 'next' into rework-home 2022-08-26 09:58:52 +02:00
Andras Bacsai
0f8f33e9fe remove unnecessary things 2022-08-26 09:49:17 +02:00
Andras Bacsai
12e91f1c6b fix 2022-08-26 09:33:50 +02:00
Andras Bacsai
f4f605867b hm 2022-08-26 09:09:41 +02:00
Andras Bacsai
ac40abba2e Update README.md 2022-08-26 09:05:26 +02:00
Andras Bacsai
224604f2e7 fixes 2022-08-26 09:01:48 +02:00
Andras Bacsai
ee4360de3a fix: better worker system 2022-08-26 06:29:06 +00:00
Kaname
0af59b9602 Merge remote-tracking branch 'upstream/next' into rework-home 2022-08-26 02:39:55 +00:00
Kaname
ff8fe68f14 feat(ui): rework home UI and with responsive design 2022-08-26 01:40:46 +00:00
Andras Bacsai
71c15e0ff5 fix: worker 2022-08-25 12:49:09 +00:00
Andras Bacsai
6be1fbacde Merge branch 'main' of https://github.com/coollabsio/coolify into next 2022-08-25 12:02:30 +00:00
Andras Bacsai
be594dd49e revert test 2022-08-25 12:44:04 +02:00
Andras Bacsai
6b65d435fb hmn 2022-08-25 12:32:50 +02:00
Andras Bacsai
0dc5212066 Revert things 2022-08-25 12:25:46 +02:00
Andras Bacsai
7c8ffd510e asd 2022-08-25 11:59:56 +02:00
Andras Bacsai
8e6423e873 hmm 2022-08-25 11:54:23 +02:00
Andras Bacsai
c99f74b351 debug 2022-08-25 11:49:19 +02:00
Andras Bacsai
57b462223c test 2022-08-25 11:09:38 +02:00
Andras Bacsai
055db97273 hm 2022-08-25 10:53:04 +02:00
Andras Bacsai
c80ebf73ee hmm 2022-08-25 10:47:58 +02:00
Andras Bacsai
9b613294ae debug cpu usage 2022-08-25 10:43:01 +02:00
Andras Bacsai
8cb73e1680 hmm 2022-08-25 10:37:21 +02:00
Andras Bacsai
27dfa24cfb remove debug 2022-08-25 10:29:08 +02:00
Andras Bacsai
c53f0dbb30 fix: high cpu usage 2022-08-25 10:28:32 +02:00
Andras Bacsai
db16a357e8 debug cpu usage 2022-08-25 10:17:07 +02:00
Andras Bacsai
01e71958b2 fix: build queue system 2022-08-25 10:04:46 +02:00
Andras Bacsai
f379519d40 chore: version++ 2022-08-24 14:50:34 +02:00
Andras Bacsai
54a09958d5 fix: never stop deplyo queue 2022-08-24 11:16:49 +02:00
Andras Bacsai
3421de06d5 Merge pull request #558 from coollabsio/next
v3.8.1
2022-08-24 10:54:53 +02:00
Andras Bacsai
e44eb01396 fix: dashboard for admins 2022-08-24 10:45:58 +02:00
Andras Bacsai
313143586b fix: cancelling jobs 2022-08-24 10:35:28 +02:00
Andras Bacsai
c8ae72893a fix: clear queue on cancelling jobs 2022-08-24 09:39:24 +02:00
Andras Bacsai
cbc3735ca0 fix: ui buttons 2022-08-23 13:03:01 +02:00
Andras Bacsai
31fdbdf8c9 Merge pull request #554 from coollabsio/next
v3.8.0
2022-08-23 11:41:52 +02:00
Andras Bacsai
2741d0ab2a fix: show build log start/end 2022-08-23 11:36:14 +02:00
Andras Bacsai
79b0187b58 fix: stream build logs 2022-08-23 11:29:25 +02:00
Andras Bacsai
bf5659d0e2 fix: dashboard for non-root users 2022-08-23 10:19:57 +02:00
Andras Bacsai
f6314cab69 chore: version++ 2022-08-23 10:11:58 +02:00
Andras Bacsai
4f5fe3d383 feat: Searxng service 2022-08-23 10:11:38 +02:00
Andras Bacsai
1c720d587c fix: batch secret = 2022-08-23 08:57:15 +02:00
Andras Bacsai
c46dc99224 fix: exposedPort checker 2022-08-23 08:50:28 +02:00
Andras Bacsai
5e43d4f20d update packages 2022-08-23 08:49:41 +02:00
Andras Bacsai
359434bfd3 fix: cancel build after 5 seconds 2022-08-23 08:49:32 +02:00
Andras Bacsai
e755a2d4ec fix: port checker 2022-08-22 19:30:09 +00:00
Ikechukwu Eze
7ed1ced521 insecure mode 2022-08-04 13:11:13 +02:00
Ikechukwu Eze
801b9c1483 Merge branch 'next' into edge-db 2022-07-31 19:54:30 +02:00
Ikechukwu Eze
3990bebca3 add to supported databases 2022-07-31 19:50:25 +02:00
Ikechukwu Eze
8a2de1001f first steps with ui 2022-07-31 19:50:03 +02:00
185 changed files with 12381 additions and 8070 deletions


@@ -8,7 +8,7 @@
// Append -bullseye or -buster to pin to an OS version.
// Use -bullseye variants on local arm64/Apple Silicon.
"args": {
"VARIANT": "16-bullseye"
"VARIANT": "18-bullseye"
}
},
// Set *default* container specific settings.json values on container create.
@@ -20,12 +20,13 @@
"svelte.svelte-vscode",
"ardenivanov.svelte-intellisense",
"Prisma.prisma",
"bradlc.vscode-tailwindcss"
"bradlc.vscode-tailwindcss",
"waderyan.gitblame"
],
// Use 'forwardPorts' to make a list of ports inside the container available locally.
"forwardPorts": [3000],
"forwardPorts": [3000, 3001],
// Use 'postCreateCommand' to run commands after the container is created.
"postCreateCommand": "cp .env.template .env && pnpm install && pnpm db:push && pnpm db:seed",
"postCreateCommand": "cp apps/api/.env.example apps/api/.env && pnpm install && pnpm db:push && pnpm db:seed",
// Comment out to connect as root instead. More info: https://aka.ms/vscode-remote/containers/non-root.
"remoteUser": "node",
"features": {


@@ -2,14 +2,14 @@ name: production-release
on:
release:
types: [published]
types: [released]
jobs:
making-something-cool:
runs-on: ubuntu-latest
arm64-build:
runs-on: [self-hosted, arm64]
steps:
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
@@ -26,12 +26,60 @@ jobs:
uses: docker/build-push-action@v2
with:
context: .
platforms: linux/amd64,linux/arm64
platforms: linux/arm64
push: true
tags: coollabsio/coolify:latest,coollabsio/coolify:${{steps.package-version.outputs.current-version}}
cache-from: type=registry,ref=coollabsio/coolify:buildcache
cache-to: type=registry,ref=coollabsio/coolify:buildcache,mode=max
tags: coollabsio/coolify:${{steps.package-version.outputs.current-version}}-arm64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-arm64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-arm64,mode=max
amd64-build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Get current package version
uses: martinbeentjes/npm-get-version-action@v1.2.3
id: package-version
- name: Build and push
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64
push: true
tags: coollabsio/coolify:${{steps.package-version.outputs.current-version}}-amd64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-amd64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-amd64,mode=max
merge-manifest:
runs-on: ubuntu-latest
needs: [amd64-build, arm64-build]
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Get current package version
uses: martinbeentjes/npm-get-version-action@v1.2.3
id: package-version
- name: Create & publish manifest
run: |
docker manifest create coollabsio/coolify:${{steps.package-version.outputs.current-version}} --amend coollabsio/coolify:${{steps.package-version.outputs.current-version}}-amd64 --amend coollabsio/coolify:${{steps.package-version.outputs.current-version}}-arm64
docker manifest push coollabsio/coolify:${{steps.package-version.outputs.current-version}}
- uses: sarisia/actions-status-discord@v1
if: always()
with:
webhook: ${{ secrets.DISCORD_WEBHOOK_DEV_CHANNEL }}
webhook: ${{ secrets.DISCORD_WEBHOOK_PROD_RELEASE_CHANNEL }}

.github/workflows/release-candidate.yml (new file, 90 lines)

@@ -0,0 +1,90 @@
name: release-candidate
on:
release:
types: [prereleased]
jobs:
arm64-making-something-cool:
runs-on: [self-hosted, arm64]
steps:
- name: Checkout
uses: actions/checkout@v3
with:
ref: "next"
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Get current package version
uses: martinbeentjes/npm-get-version-action@v1.2.3
id: package-version
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
platforms: linux/arm64
push: true
tags: coollabsio/coolify:${{github.event.release.name}}-arm64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-rc-arm64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-rc-arm64,mode=max
- uses: sarisia/actions-status-discord@v1
if: always()
with:
webhook: ${{ secrets.DISCORD_WEBHOOK_DEV_RELEASE_CHANNEL }}
amd64-making-something-cool:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
with:
ref: "next"
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Get current package version
uses: martinbeentjes/npm-get-version-action@v1.2.3
id: package-version
- name: Build and push
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64
push: true
tags: coollabsio/coolify:${{github.event.release.name}}-amd64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-rc-amd64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-rc-amd64,mode=max
- uses: sarisia/actions-status-discord@v1
if: always()
with:
webhook: ${{ secrets.DISCORD_WEBHOOK_DEV_RELEASE_CHANNEL }}
merge-manifest-to-be-cool:
runs-on: ubuntu-latest
needs: [arm64-making-something-cool, amd64-making-something-cool]
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Create & publish manifest
run: |
docker manifest create coollabsio/coolify:${{github.event.release.name}} --amend coollabsio/coolify:${{github.event.release.name}}-amd64 --amend coollabsio/coolify:${{github.event.release.name}}-arm64
docker manifest push coollabsio/coolify:${{github.event.release.name}}


@@ -6,11 +6,13 @@ on:
- next
jobs:
staging-release:
runs-on: ubuntu-latest
arm64-making-something-cool:
runs-on: [self-hosted, arm64]
steps:
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v3
with:
ref: "next"
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
@@ -20,16 +22,66 @@ jobs:
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Get current package version
uses: martinbeentjes/npm-get-version-action@v1.2.3
id: package-version
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
platforms: linux/arm64
push: true
tags: coollabsio/coolify:next-arm64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-next-arm64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-next-arm64,mode=max
amd64-making-something-cool:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
with:
ref: "next"
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Get current package version
uses: martinbeentjes/npm-get-version-action@v1.2.3
id: package-version
- name: Build and push
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64
push: true
tags: coollabsio/coolify:next
cache-from: type=registry,ref=coollabsio/coolify:buildcache-next
cache-to: type=registry,ref=coollabsio/coolify:buildcache-next,mode=max
tags: coollabsio/coolify:next-amd64,coollabsio/coolify:next-test
cache-from: type=registry,ref=coollabsio/coolify:buildcache-next-amd64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-next-amd64,mode=max
merge-manifest-to-be-cool:
runs-on: ubuntu-latest
needs: [arm64-making-something-cool, amd64-making-something-cool]
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Create & publish manifest
run: |
docker manifest create coollabsio/coolify:next --amend coollabsio/coolify:next-amd64 --amend coollabsio/coolify:next-arm64
docker manifest push coollabsio/coolify:next
- uses: sarisia/actions-status-discord@v1
if: always()
with:
webhook: ${{ secrets.DISCORD_WEBHOOK_DEV_CHANNEL }}
webhook: ${{ secrets.DISCORD_WEBHOOK_DEV_RELEASE_CHANNEL }}

.gitignore (3 changed lines)

@@ -11,4 +11,5 @@ dist
client
apps/api/db/*.db
local-serve
apps/api/db/migration.db-journal
apps/api/db/migration.db-journal
apps/api/core*

CODE_OF_CONDUCT.md (new file, 94 lines)

@@ -0,0 +1,94 @@
# Citizen Code of Conduct
## 1. Purpose
A primary goal of Coolify is to be inclusive to the largest number of contributors, with the most varied and diverse backgrounds possible. As such, we are committed to providing a friendly, safe and welcoming environment for all, regardless of gender, sexual orientation, ability, ethnicity, socioeconomic status, and religion (or lack thereof).
This code of conduct outlines our expectations for all those who participate in our community, as well as the consequences for unacceptable behavior.
We invite all those who participate in Coolify to help us create safe and positive experiences for everyone.
## 2. Open [Source/Culture/Tech] Citizenship
A supplemental goal of this Code of Conduct is to increase open [source/culture/tech] citizenship by encouraging participants to recognize and strengthen the relationships between our actions and their effects on our community.
Communities mirror the societies in which they exist and positive action is essential to counteract the many forms of inequality and abuses of power that exist in society.
If you see someone who is making an extra effort to ensure our community is welcoming, friendly, and encourages all participants to contribute to the fullest extent, we want to know.
## 3. Expected Behavior
The following behaviors are expected and requested of all community members:
* Participate in an authentic and active way. In doing so, you contribute to the health and longevity of this community.
* Exercise consideration and respect in your speech and actions.
* Attempt collaboration before conflict.
* Refrain from demeaning, discriminatory, or harassing behavior and speech.
* Be mindful of your surroundings and of your fellow participants. Alert community leaders if you notice a dangerous situation, someone in distress, or violations of this Code of Conduct, even if they seem inconsequential.
* Remember that community event venues may be shared with members of the public; please be respectful to all patrons of these locations.
## 4. Unacceptable Behavior
The following behaviors are considered harassment and are unacceptable within our community:
* Violence, threats of violence or violent language directed against another person.
* Sexist, racist, homophobic, transphobic, ableist or otherwise discriminatory jokes and language.
* Posting or displaying sexually explicit or violent material.
* Posting or threatening to post other people's personally identifying information ("doxing").
* Personal insults, particularly those related to gender, sexual orientation, race, religion, or disability.
* Inappropriate photography or recording.
* Inappropriate physical contact. You should have someone's consent before touching them.
* Unwelcome sexual attention. This includes, sexualized comments or jokes; inappropriate touching, groping, and unwelcomed sexual advances.
* Deliberate intimidation, stalking or following (online or in person).
* Advocating for, or encouraging, any of the above behavior.
* Sustained disruption of community events, including talks and presentations.
## 5. Weapons Policy
No weapons will be allowed at Coolify events, community spaces, or in other spaces covered by the scope of this Code of Conduct. Weapons include but are not limited to guns, explosives (including fireworks), and large knives such as those used for hunting or display, as well as any other item used for the purpose of causing injury or harm to others. Anyone seen in possession of one of these items will be asked to leave immediately, and will only be allowed to return without the weapon. Community members are further expected to comply with all state and local laws on this matter.
## 6. Consequences of Unacceptable Behavior
Unacceptable behavior from any community member, including sponsors and those with decision-making authority, will not be tolerated.
Anyone asked to stop unacceptable behavior is expected to comply immediately.
If a community member engages in unacceptable behavior, the community organizers may take any action they deem appropriate, up to and including a temporary ban or permanent expulsion from the community without warning (and without refund in the case of a paid event).
## 7. Reporting Guidelines
If you are subject to or witness unacceptable behavior, or have any other concerns, please notify a community organizer as soon as possible. hi@coollabs.io.
Additionally, community organizers are available to help community members engage with local law enforcement or to otherwise help those experiencing unacceptable behavior feel safe. In the context of in-person events, organizers will also provide escorts as desired by the person experiencing distress.
## 8. Addressing Grievances
If you feel you have been falsely or unfairly accused of violating this Code of Conduct, you should notify coollabsio with a concise description of your grievance. Your grievance will be handled in accordance with our existing governing policies.
## 9. Scope
We expect all community participants (contributors, paid or otherwise; sponsors; and other guests) to abide by this Code of Conduct in all community venues--online and in-person--as well as in all one-on-one communications pertaining to community business.
This code of conduct and its related procedures also applies to unacceptable behavior occurring outside the scope of community activities when such behavior has the potential to adversely affect the safety and well-being of community members.
## 10. Contact info
hi@coollabs.io
## 11. License and attribution
The Citizen Code of Conduct is distributed by [Stumptown Syndicate](http://stumptownsyndicate.org) under a [Creative Commons Attribution-ShareAlike license](http://creativecommons.org/licenses/by-sa/3.0/).
Portions of text derived from the [Django Code of Conduct](https://www.djangoproject.com/conduct/) and the [Geek Feminism Anti-Harassment Policy](http://geekfeminism.wikia.com/wiki/Conference_anti-harassment/Policy).
_Revision 2.3. Posted 6 March 2017._
_Revision 2.2. Posted 4 February 2016._
_Revision 2.1. Posted 23 June 2014._
_Revision 2.0, adopted by the [Stumptown Syndicate](http://stumptownsyndicate.org) board on 10 January 2013. Posted 17 March 2013._


@@ -2,7 +2,6 @@
First of all, thank you for considering contributing to my project! It means a lot 💜.
Contribution guide is for v2, not applicable for v3
## 🙋 Want to help?
@@ -17,13 +16,17 @@ This is a little list of what you can do to help the project:
## 👋 Introduction
### Setup with github codespaces
### Setup with Github codespaces
If you have github codespaces enabled then you can just create a codespace and run `pnpm dev` to run your the dev environment. All the required dependencies and packages has been configured for you already.
### Setup with Gitpod
If you have a [Gitpod](https://gitpod.io), you can just create a workspace from this repository, run `pnpm install && pnpm db:push && pnpm db:seed` and then `pnpm dev`. All the required dependencies and packages has been configured for you already.
### Setup locally in your machine
> 🔴 At the moment, Coolify **doesn't support Windows**. You must use Linux or MacOS. 💡 Although windows users can use github codespaces for development
> 🔴 At the moment, Coolify **doesn't support Windows**. You must use Linux or MacOS. Consider using Gitpod or Github Codespaces.
#### Recommended Pull Request Guideline
@@ -31,21 +34,38 @@ If you have github codespaces enabled then you can just create a codespace and r
- Clone your fork repo to local
- Create a new branch
- Push to your fork repo
- Create a pull request: https://github.com/coollabsio/compare
- Create a pull request: https://github.com/coollabsio/coolify/compare
- Write a proper description
- Open the pull request to review against `next` branch
---
# How to start after you set up your local fork?
# 🧑‍💻 Developer contribution
## Technical skills required
Due to the lock file, this repository is best with [pnpm](https://pnpm.io). I recommend you try and use `pnpm` because it is cool and efficient!
- **Languages**: Node.js / Javascript / Typescript
- **Framework JS/TS**: [SvelteKit](https://kit.svelte.dev/) & [Fastify](https://www.fastify.io/)
- **Database ORM**: [Prisma.io](https://www.prisma.io/)
- **Docker Engine API**
You need to have [Docker Engine](https://docs.docker.com/engine/install/) installed locally.
---
#### Steps for local setup
## How to start after you set up your local fork?
1. Copy `.env.template` to `.env` and set the `COOLIFY_APP_ID` environment variable to something cool.
### Prerequisites
1. Due to the lock file, this repository is best with [pnpm](https://pnpm.io). I recommend you try and use `pnpm` because it is cool and efficient!
2. You need to have [Docker Engine](https://docs.docker.com/engine/install/) installed locally.
3. You need to have [Docker Compose Plugin](https://docs.docker.com/compose/install/compose-plugin/) installed locally.
4. You need to have [GIT LFS Support](https://git-lfs.github.com/) installed locally.
Optional:
4. To test Heroku buildpacks, you need [pack](https://github.com/buildpacks/pack) binary installed locally.
### Steps for local setup
1. Copy `apps/api/.env.template` to `apps/api/.env.template` and set the `COOLIFY_APP_ID` environment variable to something cool.
2. Install dependencies with `pnpm install`.
3. Need to create a local SQlite database with `pnpm db:push`.
@@ -54,28 +74,17 @@ You need to have [Docker Engine](https://docs.docker.com/engine/install/) instal
4. Seed the database with base entities with `pnpm db:seed`
5. You can start coding after starting `pnpm dev`.
## 🧑‍💻 Developer contribution
---
### Technical skills required
- **Languages**: Node.js / Javascript / Typescript
- **Framework JS/TS**: Svelte / SvelteKit
- **Database ORM**: Prisma.io
- **Docker Engine**
### Database migrations
## Database migrations
During development, if you change the database layout, you need to run `pnpm db:push` to migrate the database and create types for Prisma. You also need to restart the development process.
If the schema is finalized, you need to create a migration file with `pnpm db:migrate <nameOfMigration>` where `nameOfMigration` is given by you. Make it sense. :)
### Tricky parts
- BullMQ, the queue system Coolify uses, cannot be hot reloaded. So if you change anything in the files related to it, you need to restart the development process. I'm actively looking for a different queue/scheduler library. I'm open to discussion!
---
# How to add new services
## How to add new services
You can add any open-source and self-hostable software (service/application) to Coolify if the following statements are true:
@@ -95,14 +104,14 @@ There are 5 steps you should make on the backend side.
> I will use [Umami](https://umami.is/) as an example service.
### Create Prisma / database schema for the new service.
### Create Prisma / Database schema for the new service.
You only need to do this if you store passwords or any persistent configuration. Mostly it is required by all services, but there are some exceptions, like NocoDB.
Update Prisma schema in [prisma/schema.prisma](prisma/schema.prisma).
- Add new model with the new service name.
- Make a relationshup with `Service` model.
- Make a relationship with `Service` model.
- In the `Service` model, the name of the new field should be with low-capital.
- If the service needs a database, define a `publicPort` field to be able to make it's database public, example field name in case of PostgreSQL: `postgresqlPublicPort`. It should be a optional field.
@@ -110,13 +119,13 @@ If you are finished with the Prisma schema, you should update the database schem
> You must restart the running development environment to be able to use the new model
> If you use VSCode, you probably need to restart the `Typescript Language Server` to get the new types loaded in the running VSCode.
> If you use VSCode/TLS, you probably need to restart the `Typescript Language Server` to get the new types loaded in the running environment.
### Add supported versions
Supported versions are hardcoded into Coolify (for now).
You need to update `supportedServiceTypesAndVersions` function at [src/lib/components/common.ts](src/lib/components/common.ts). Example JSON:
You need to update `supportedServiceTypesAndVersions` function at [apps/api/src/lib/services/supportedVersions.ts](apps/api/src/lib/services/supportedVersions.ts). Example JSON:
```js
{
@@ -139,12 +148,12 @@ You need to update `supportedServiceTypesAndVersions` function at [src/lib/compo
}
```
### Update global functions
### Add required functions/properties
1. Add the new service to the `include` variable in [src/lib/database/services.ts](src/lib/database/services.ts), so it will be included in all places in the database queries where it is required.
1. Add the new service to the `includeServices` variable in [apps/api/src/lib/services/common.ts](apps/api/src/lib/services/common.ts), so it will be included in all places in the database queries where it is required.
```js
const include: Prisma.ServiceInclude = {
const include: any = {
destinationDocker: true,
persistentStorage: true,
serviceSecret: true,
@@ -158,7 +167,7 @@ const include: Prisma.ServiceInclude = {
};
```
2. Update the database update query with the new service type to `configureServiceType` function in [src/lib/database/services.ts](src/lib/database/services.ts). This function defines the automatically generated variables (passwords, users, etc.) and it's encryption process (if applicable).
2. Update the database update query with the new service type to `configureServiceType` function in [apps/api/src/lib/services/common.ts](apps/api/src/lib/services/common.ts). This function defines the automatically generated variables (passwords, users, etc.) and it's encryption process (if applicable).
```js
[...]
@@ -184,80 +193,46 @@ else if (type === 'umami') {
}
```
3. Add decryption process for configurations and passwords to `getService` function in [src/lib/database/services.ts](src/lib/database/services.ts)
3. Add field details to [apps/api/src/lib/services/serviceFields.ts](apps/api/src/lib/services/serviceFields.ts), so every component will know what to do with the values (decrypt/show it by default/readonly)
```js
if (body.umami?.postgresqlPassword)
body.umami.postgresqlPassword = decrypt(body.umami.postgresqlPassword);
if (body.umami?.hashSalt) body.umami.hashSalt = decrypt(body.umami.hashSalt);
export const umami = [{
name: 'postgresqlUser',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
}]
```
4. Add service deletion query to `removeService` function in [src/lib/database/services.ts](src/lib/database/services.ts)
4. Add service deletion query to `removeService` function in [apps/api/src/lib/services/common.ts](apps/api/src/lib/services/common.ts)
### Create API endpoints.
5. Add start process for the new service in [apps/api/src/lib/services/handlers.ts](apps/api/src/lib/services/handlers.ts)
You need to add a new folder under [src/routes/services/[id]](src/routes/services/[id]) with the low-capital name of the service. You need 3 default files in that folder.
> See startUmamiService() function as example.
#### `index.json.ts`:
6. Add the newly added start process to `startService` in [apps/api/src/routes/api/v1/services/handlers.ts](apps/api/src/routes/api/v1/services/handlers.ts)
It has a POST endpoint that updates the service details in Coolify's database, such as name, url, other configurations, like passwords. It should look something like this:
```js
import { getUserDetails } from '$lib/common';
import * as db from '$lib/database';
import { ErrorHandler } from '$lib/database';
import type { RequestHandler } from '@sveltejs/kit';
export const post: RequestHandler = async (event) => {
const { status, body } = await getUserDetails(event);
if (status === 401) return { status, body };
const { id } = event.params;
let { name, fqdn } = await event.request.json();
if (fqdn) fqdn = fqdn.toLowerCase();
try {
await db.updateService({ id, fqdn, name });
return { status: 201 };
} catch (error) {
return ErrorHandler(error);
}
};
```
If it's necessary, you can create your own database update function, specifically for the new service.
#### `start.json.ts`
It has a POST endpoint that sets all the required secrets, persistent volumes, `docker-compose.yaml` file and sends a request to the specified docker engine.
You could also define an `HTTP` or `TCP` proxy for every other port that should be proxied to your server. (See `startHttpProxy` and `startTcpProxy` functions in [src/lib/haproxy/index.ts](src/lib/haproxy/index.ts))
#### `stop.json.ts`
It has a POST endpoint that stops the service and all dependent (TCP/HTTP proxies) containers. If publicPort is specified it also needs to cleanup it from the database.
## Frontend
1. You need to add a custom logo at [src/lib/components/svg/services/](src/lib/components/svg/services/) as a svelte component.
7. You need to add a custom logo at [apps/ui/src/lib/components/svg/services](apps/ui/src/lib/components/svg/services) as a svelte component and export it in [apps/ui/src/lib/components/svg/services/index.ts](apps/ui/src/lib/components/svg/services/index.ts)
SVG is recommended, but you can use PNG as well. It should have the `isAbsolute` variable with the suitable CSS classes, primarily for sizing and positioning.
2. You need to include it the logo at
8. You need to include it the logo at:
- [src/routes/services/index.svelte](src/routes/services/index.svelte) with `isAbsolute` in two places,
- [src/lib/components/ServiceLinks.svelte](src/lib/components/ServiceLinks.svelte) with `isAbsolute` and a link to the docs/main site of the service
- [src/routes/services/[id]/configuration/type.svelte](src/routes/services/[id]/configuration/type.svelte) with `isAbsolute`.
- [apps/ui/src/lib/components/svg/services/ServiceIcons.svelte](apps/ui/src/lib/components/svg/services/ServiceIcons.svelte) with `isAbsolute`.
- [apps/ui/src/routes/services/[id]/_ServiceLinks.svelte](apps/ui/src/routes/services/[id]/_ServiceLinks.svelte) with the link to the docs/main site of the service
3. By default the URL and the name frontend forms are included in [src/routes/services/[id]/\_Services/\_Services.svelte](src/routes/services/[id]/_Services/_Services.svelte).
9. By default the URL and the name frontend forms are included in [apps/ui/src/routes/services/[id]/_Services/_Services.svelte](apps/ui/src/routes/services/[id]/_Services/_Services.svelte).
If you need to show more details on the frontend, such as users/passwords, you need to add Svelte component to [src/routes/services/[id]/\_Services](src/routes/services/[id]/_Services) with an underscore. For example, see other files in that folder.
If you need to show more details on the frontend, such as users/passwords, you need to add Svelte component to [apps/ui/src/routes/services/[id]/_Services](apps/ui/src/routes/services/[id]/_Services) with an underscore.
> For example, see other [here](apps/ui/src/routes/services/[id]/_Services/_Umami.svelte).
You also need to add the new inputs to the `index.json.ts` file of the specific service, like for MinIO here: [src/routes/services/[id]/minio/index.json.ts](src/routes/services/[id]/minio/index.json.ts)
## 🌐 Translate the project
Good job! 👏
<!-- # 🌐 Translate the project
The project use [sveltekit-i18n](https://github.com/sveltekit-i18n/lib) to translate the project.
It follows the [ISO 639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) to name languages.
@@ -278,4 +253,4 @@ If your language doesn't appear in the [locales folder list](src/lib/locales/),
1. In `src/lib/locales/`, Copy paste `en.json` and rename it with your language (eg: `cz.json`).
2. In the [lang.json](src/lib/lang.json) file, add a line after the first bracket (`{`) with `"ISO of your language": "Language",` (eg: `"cz": "Czech",`).
3. Have fun translating!
3. Have fun translating! -->

CONTRIBUTION_NEW.md (new file, 108 lines)

@@ -0,0 +1,108 @@
---
head:
- - meta
- name: description
content: Coolify - Databases
- - meta
- name: keywords
content: databases coollabs coolify
- - meta
- name: twitter:card
content: summary_large_image
- - meta
- name: twitter:site
content: '@andrasbacsai'
- - meta
- name: twitter:title
content: Coolify
- - meta
- name: twitter:description
content: An open-source & self-hostable Heroku / Netlify alternative.
- - meta
- name: twitter:image
content: https://cdn.coollabs.io/assets/coollabs/og-image-databases.png
- - meta
- property: og:type
content: website
- - meta
- property: og:url
content: https://coolify.io
- - meta
- property: og:title
content: Coolify
- - meta
- property: og:description
content: An open-source & self-hostable Heroku / Netlify alternative.
- - meta
- property: og:site_name
content: Coolify
- - meta
- property: og:image
content: https://cdn.coollabs.io/assets/coollabs/og-image-databases.png
---
# Contribution
First, thanks for considering to contribute to my project. It really means a lot! :)
You can ask for guidance anytime on our Discord server in the #contribution channel.
## Setup your development environment
### Github codespaces
If you have github codespaces enabled then you can just create a codespace and run `pnpm dev` to run your the dev environment. All the required dependencies and packages has been configured for you already.
### Gitpod
If you have a [Gitpod](https://gitpod.io), you can just create a workspace from this repository, run `pnpm install && pnpm db:push && pnpm db:seed` and then `pnpm dev`. All the required dependencies and packages has been configured for you already.
### Local Machine
> At the moment, Coolify `doesn't support Windows`. You must use `Linux` or `MacOS` or consider using Gitpod or Github Codespaces.
- Due to the lock file, this repository is best with [pnpm](https://pnpm.io). I recommend you try and use `pnpm` because it is cool and efficient!
- You need to have [Docker Engine](https://docs.docker.com/engine/install/) installed locally.
- You need to have [Docker Compose Plugin](https://docs.docker.com/compose/install/compose-plugin/) installed locally.
- You need to have [GIT LFS Support](https://git-lfs.github.com/) installed locally.
Optional:
- To test Heroku buildpacks, you need [pack](https://github.com/buildpacks/pack) binary installed locally.
### Inside a Docker container
`WIP`
## Setup Coolify
- Copy `apps/api/.env.template` to `apps/api/.env.template` and set the `COOLIFY_APP_ID` environment variable to something cool.
- `pnpm install` to install dependencies.
- `pnpm db:push` to o create a local SQlite database.
This will apply all migrations at `db/dev.db`.
- `pnpm db:seed` seed the database.
- `pnpm dev` start coding.
## Technical skills required
- **Languages**: Node.js / Javascript / Typescript
- **Framework JS/TS**: [SvelteKit](https://kit.svelte.dev/) & [Fastify](https://www.fastify.io/)
- **Database ORM**: [Prisma.io](https://www.prisma.io/)
- **Docker Engine API**
## Add a new service
### Which service is eligable to add to Coolify?
The following statements needs to be true:
- Self-hostable
- Open-source
- Maintained (I do not want to add software full of bugs)
### Create Prisma / Database schema for the new service.
All data that needs to be persist for a service should be saved to the database in `cleartext` or `encrypted`.
very password/api key/passphrase needs to be encrypted. If you are not sure, whether it should be encrypted or not, just encrypt it.
Update Prisma schema in [src/api/prisma/schema.prisma](https://github.com/coollabsio/coolify/blob/main/apps/api/prisma/schema.prisma).
- Add new model with the new service name.
- Make a relationship with `Service` model.
- In the `Service` model, the name of the new field should be with low-capital.
- If the service needs a database, define a `publicPort` field to be able to make it's database public, example field name in case of PostgreSQL: `postgresqlPublicPort`. It should be a optional field.


@@ -1,30 +1,26 @@
FROM node:18-alpine3.16 as build
ARG PNPM_VERSION=7.11.0
ARG NPM_VERSION=8.19.1
FROM node:18-slim as build
WORKDIR /app
RUN apk add --no-cache curl
RUN curl -sL https://unpkg.com/@pnpm/self-installer | node
RUN apt update && apt -y install curl
RUN npm --no-update-notifier --no-fund --global install pnpm@${PNPM_VERSION}
COPY . .
RUN pnpm install
RUN pnpm build
# Production build
FROM node:18-alpine3.16
FROM node:18-slim
WORKDIR /app
ENV NODE_ENV production
ARG TARGETPLATFORM
ENV PRISMA_QUERY_ENGINE_BINARY=/app/prisma-engines/query-engine \
PRISMA_MIGRATION_ENGINE_BINARY=/app/prisma-engines/migration-engine \
PRISMA_INTROSPECTION_ENGINE_BINARY=/app/prisma-engines/introspection-engine \
PRISMA_FMT_BINARY=/app/prisma-engines/prisma-fmt \
PRISMA_CLI_QUERY_ENGINE_TYPE=binary \
PRISMA_CLIENT_ENGINE_TYPE=binary
COPY --from=coollabsio/prisma-engine:3.15 /prisma-engines/query-engine /prisma-engines/migration-engine /prisma-engines/introspection-engine /prisma-engines/prisma-fmt /app/prisma-engines/
RUN apk add --no-cache git git-lfs openssh-client curl jq cmake sqlite openssl psmisc
RUN curl -sL https://unpkg.com/@pnpm/self-installer | node
RUN apt update && apt -y install --no-install-recommends ca-certificates git git-lfs openssh-client curl jq cmake sqlite3 openssl psmisc python3
RUN apt-get clean autoclean && apt-get autoremove --yes && rm -rf /var/lib/{apt,dpkg,cache,log}/
RUN npm --no-update-notifier --no-fund --global install pnpm@${PNPM_VERSION}
RUN npm install -g npm@${PNPM_VERSION}
RUN mkdir -p ~/.docker/cli-plugins/
# https://download.docker.com/linux/static/stable/
@@ -45,4 +41,5 @@ COPY --from=build /app/docker-compose.yaml .
RUN pnpm install -p
EXPOSE 3000
ENV CHECKPOINT_DISABLE=1
CMD pnpm start


@@ -16,7 +16,7 @@ If you have a new service / build pack you would like to add, raise an idea [her
## How to install
For more details goto the [docs](https://docs.coollabs.io/coolify/installation.html).
For more details goto the [docs](https://docs.coollabs.io/coolify/installation).
Installation is automated with the following command:
@@ -103,7 +103,7 @@ A fresh installation is necessary. v2 and v3 are not compatible with v1.
- Twitter: [@andrasbacsai](https://twitter.com/andrasbacsai)
- Telegram: [@andrasbacsai](https://t.me/andrasbacsai)
- Email: [andras@coollabs.io](mailto:andras@coollabs.io)
- Discord: [Invitation](https://discord.gg/6rDM4fkymF)
- Discord: [Invitation](https://coollabs.io/discord)
## Financial Contributors

View File

@@ -1,6 +1,7 @@
COOLIFY_APP_ID=
COOLIFY_SECRET_KEY="12341234123412341234123412341234 - 32 long"
COOLIFY_DATABASE_URL=
COOLIFY_APP_ID=local-dev
# 32 bits long secret key
COOLIFY_SECRET_KEY=12341234123412341234123412341234
COOLIFY_DATABASE_URL=file:../db/dev.db
COOLIFY_SENTRY_DSN=
COOLIFY_IS_ON=docker

View File

@@ -15,55 +15,57 @@
},
"dependencies": {
"@breejs/ts-worker": "2.0.0",
"@fastify/autoload": "5.2.0",
"@fastify/cookie": "7.3.1",
"@fastify/autoload": "5.3.1",
"@fastify/cookie": "8.1.0",
"@fastify/cors": "8.1.0",
"@fastify/env": "4.1.0",
"@fastify/jwt": "6.3.2",
"@fastify/static": "6.5.0",
"@iarna/toml": "2.2.5",
"@prisma/client": "3.15.2",
"@ladjs/graceful": "3.0.2",
"@prisma/client": "4.3.1",
"axios": "0.27.2",
"bcryptjs": "2.4.3",
"bree": "9.1.2",
"cabin": "9.1.2",
"compare-versions": "4.1.3",
"compare-versions": "5.0.1",
"cuid": "2.1.8",
"dayjs": "1.11.4",
"dockerode": "3.3.3",
"dayjs": "1.11.5",
"dockerode": "3.3.4",
"dotenv-extended": "2.9.0",
"fastify": "4.4.0",
"fastify-plugin": "4.1.0",
"execa": "6.1.0",
"fastify": "4.5.3",
"fastify-plugin": "4.2.1",
"generate-password": "1.7.0",
"get-port": "6.1.2",
"got": "12.3.1",
"got": "12.4.1",
"is-ip": "5.0.0",
"is-port-reachable": "4.0.0",
"js-yaml": "4.1.0",
"jsonwebtoken": "8.5.1",
"node-forge": "1.3.1",
"node-os-utils": "1.3.7",
"p-queue": "7.3.0",
"p-all": "4.0.0",
"p-throttle": "5.0.0",
"public-ip": "6.0.1",
"ssh-config": "4.1.6",
"strip-ansi": "7.0.1",
"unique-names-generator": "4.7.1"
},
"devDependencies": {
"@types/node": "18.6.5",
"@types/node": "18.7.15",
"@types/node-os-utils": "1.3.0",
"@typescript-eslint/eslint-plugin": "5.33.0",
"@typescript-eslint/parser": "5.33.0",
"esbuild": "0.15.0",
"eslint": "8.21.0",
"@typescript-eslint/eslint-plugin": "5.36.2",
"@typescript-eslint/parser": "5.36.2",
"esbuild": "0.15.7",
"eslint": "8.23.0",
"eslint-config-prettier": "8.5.0",
"eslint-plugin-prettier": "4.2.1",
"nodemon": "2.0.19",
"prettier": "2.7.1",
"prisma": "3.15.2",
"prisma": "4.3.1",
"rimraf": "3.0.2",
"tsconfig-paths": "4.1.0",
"typescript": "4.7.4"
"typescript": "4.8.2"
},
"prisma": {
"seed": "node prisma/seed.js"

View File

@@ -0,0 +1,13 @@
-- CreateTable
CREATE TABLE "Searxng" (
"id" TEXT NOT NULL PRIMARY KEY,
"secretKey" TEXT NOT NULL,
"redisPassword" TEXT NOT NULL,
"serviceId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "Searxng_serviceId_fkey" FOREIGN KEY ("serviceId") REFERENCES "Service" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateIndex
CREATE UNIQUE INDEX "Searxng_serviceId_key" ON "Searxng"("serviceId");

View File

@@ -0,0 +1,29 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_Setting" (
"id" TEXT NOT NULL PRIMARY KEY,
"fqdn" TEXT,
"isRegistrationEnabled" BOOLEAN NOT NULL DEFAULT false,
"dualCerts" BOOLEAN NOT NULL DEFAULT false,
"minPort" INTEGER NOT NULL DEFAULT 9000,
"maxPort" INTEGER NOT NULL DEFAULT 9100,
"proxyPassword" TEXT NOT NULL,
"proxyUser" TEXT NOT NULL,
"proxyHash" TEXT,
"isAutoUpdateEnabled" BOOLEAN NOT NULL DEFAULT false,
"isDNSCheckEnabled" BOOLEAN NOT NULL DEFAULT true,
"DNSServers" TEXT,
"isTraefikUsed" BOOLEAN NOT NULL DEFAULT true,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
"ipv4" TEXT,
"ipv6" TEXT,
"arch" TEXT,
"concurrentBuilds" INTEGER NOT NULL DEFAULT 1
);
INSERT INTO "new_Setting" ("DNSServers", "arch", "createdAt", "dualCerts", "fqdn", "id", "ipv4", "ipv6", "isAutoUpdateEnabled", "isDNSCheckEnabled", "isRegistrationEnabled", "isTraefikUsed", "maxPort", "minPort", "proxyHash", "proxyPassword", "proxyUser", "updatedAt") SELECT "DNSServers", "arch", "createdAt", "dualCerts", "fqdn", "id", "ipv4", "ipv6", "isAutoUpdateEnabled", "isDNSCheckEnabled", "isRegistrationEnabled", "isTraefikUsed", "maxPort", "minPort", "proxyHash", "proxyPassword", "proxyUser", "updatedAt" FROM "Setting";
DROP TABLE "Setting";
ALTER TABLE "new_Setting" RENAME TO "Setting";
CREATE UNIQUE INDEX "Setting_fqdn_key" ON "Setting"("fqdn");
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,24 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_Build" (
"id" TEXT NOT NULL PRIMARY KEY,
"type" TEXT NOT NULL,
"applicationId" TEXT,
"destinationDockerId" TEXT,
"gitSourceId" TEXT,
"githubAppId" TEXT,
"gitlabAppId" TEXT,
"commit" TEXT,
"pullmergeRequestId" TEXT,
"forceRebuild" BOOLEAN NOT NULL DEFAULT false,
"sourceBranch" TEXT,
"branch" TEXT,
"status" TEXT DEFAULT 'queued',
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL
);
INSERT INTO "new_Build" ("applicationId", "branch", "commit", "createdAt", "destinationDockerId", "gitSourceId", "githubAppId", "gitlabAppId", "id", "status", "type", "updatedAt") SELECT "applicationId", "branch", "commit", "createdAt", "destinationDockerId", "gitSourceId", "githubAppId", "gitlabAppId", "id", "status", "type", "updatedAt" FROM "Build";
DROP TABLE "Build";
ALTER TABLE "new_Build" RENAME TO "Build";
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,18 @@
-- CreateTable
CREATE TABLE "Weblate" (
"id" TEXT NOT NULL PRIMARY KEY,
"adminPassword" TEXT NOT NULL,
"postgresqlHost" TEXT NOT NULL,
"postgresqlPort" INTEGER NOT NULL,
"postgresqlUser" TEXT NOT NULL,
"postgresqlPassword" TEXT NOT NULL,
"postgresqlDatabase" TEXT NOT NULL,
"postgresqlPublicPort" INTEGER,
"serviceId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "Weblate_serviceId_fkey" FOREIGN KEY ("serviceId") REFERENCES "Service" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateIndex
CREATE UNIQUE INDEX "Weblate_serviceId_key" ON "Weblate"("serviceId");

View File

@@ -0,0 +1,23 @@
-- CreateTable
CREATE TABLE "Taiga" (
"id" TEXT NOT NULL PRIMARY KEY,
"secretKey" TEXT NOT NULL,
"erlangSecret" TEXT NOT NULL,
"djangoAdminPassword" TEXT NOT NULL,
"djangoAdminUser" TEXT NOT NULL,
"rabbitMQUser" TEXT NOT NULL,
"rabbitMQPassword" TEXT NOT NULL,
"postgresqlHost" TEXT NOT NULL,
"postgresqlPort" INTEGER NOT NULL,
"postgresqlUser" TEXT NOT NULL,
"postgresqlPassword" TEXT NOT NULL,
"postgresqlDatabase" TEXT NOT NULL,
"postgresqlPublicPort" INTEGER,
"serviceId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "Taiga_serviceId_fkey" FOREIGN KEY ("serviceId") REFERENCES "Service" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateIndex
CREATE UNIQUE INDEX "Taiga_serviceId_key" ON "Taiga"("serviceId");

View File

@@ -0,0 +1,22 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_ApplicationSettings" (
"id" TEXT NOT NULL PRIMARY KEY,
"applicationId" TEXT NOT NULL,
"dualCerts" BOOLEAN NOT NULL DEFAULT false,
"debug" BOOLEAN NOT NULL DEFAULT false,
"previews" BOOLEAN NOT NULL DEFAULT false,
"autodeploy" BOOLEAN NOT NULL DEFAULT true,
"isBot" BOOLEAN NOT NULL DEFAULT false,
"isPublicRepository" BOOLEAN NOT NULL DEFAULT false,
"isDBBranching" BOOLEAN NOT NULL DEFAULT false,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "ApplicationSettings_applicationId_fkey" FOREIGN KEY ("applicationId") REFERENCES "Application" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
INSERT INTO "new_ApplicationSettings" ("applicationId", "autodeploy", "createdAt", "debug", "dualCerts", "id", "isBot", "isPublicRepository", "previews", "updatedAt") SELECT "applicationId", "autodeploy", "createdAt", "debug", "dualCerts", "id", "isBot", "isPublicRepository", "previews", "updatedAt" FROM "ApplicationSettings";
DROP TABLE "ApplicationSettings";
ALTER TABLE "new_ApplicationSettings" RENAME TO "ApplicationSettings";
CREATE UNIQUE INDEX "ApplicationSettings_applicationId_key" ON "ApplicationSettings"("applicationId");
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,20 @@
/*
Warnings:
- You are about to alter the column `time` on the `BuildLog` table. The data in that column could be lost. The data in that column will be cast from `Int` to `BigInt`.
*/
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_BuildLog" (
"id" TEXT NOT NULL PRIMARY KEY,
"applicationId" TEXT,
"buildId" TEXT NOT NULL,
"line" TEXT NOT NULL,
"time" BIGINT NOT NULL
);
INSERT INTO "new_BuildLog" ("applicationId", "buildId", "id", "line", "time") SELECT "applicationId", "buildId", "id", "line", "time" FROM "BuildLog";
DROP TABLE "BuildLog";
ALTER TABLE "new_BuildLog" RENAME TO "BuildLog";
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,20 @@
-- CreateTable
CREATE TABLE "ApplicationConnectedDatabase" (
"id" TEXT NOT NULL PRIMARY KEY,
"applicationId" TEXT NOT NULL,
"databaseId" TEXT,
"hostedDatabaseType" TEXT,
"hostedDatabaseHost" TEXT,
"hostedDatabasePort" INTEGER,
"hostedDatabaseName" TEXT,
"hostedDatabaseUser" TEXT,
"hostedDatabasePassword" TEXT,
"hostedDatabaseDBName" TEXT,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "ApplicationConnectedDatabase_databaseId_fkey" FOREIGN KEY ("databaseId") REFERENCES "Database" ("id") ON DELETE SET NULL ON UPDATE CASCADE,
CONSTRAINT "ApplicationConnectedDatabase_applicationId_fkey" FOREIGN KEY ("applicationId") REFERENCES "Application" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateIndex
CREATE UNIQUE INDEX "ApplicationConnectedDatabase_applicationId_key" ON "ApplicationConnectedDatabase"("applicationId");

View File

@@ -0,0 +1,2 @@
-- AlterTable
ALTER TABLE "Setting" ADD COLUMN "isAPIDebuggingEnabled" BOOLEAN DEFAULT false;

View File

@@ -0,0 +1,13 @@
-- CreateTable
CREATE TABLE "DatabaseSecret" (
"id" TEXT NOT NULL PRIMARY KEY,
"name" TEXT NOT NULL,
"value" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
"databaseId" TEXT NOT NULL,
CONSTRAINT "DatabaseSecret_databaseId_fkey" FOREIGN KEY ("databaseId") REFERENCES "Database" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateIndex
CREATE UNIQUE INDEX "DatabaseSecret_name_databaseId_key" ON "DatabaseSecret"("name", "databaseId");

View File

@@ -1,6 +1,6 @@
generator client {
provider = "prisma-client-js"
binaryTargets = ["native", "linux-musl"]
binaryTargets = ["native"]
}
datasource db {
@@ -11,6 +11,7 @@ datasource db {
model Setting {
id String @id @default(cuid())
fqdn String? @unique
isAPIDebuggingEnabled Boolean? @default(false)
isRegistrationEnabled Boolean @default(false)
dualCerts Boolean @default(false)
minPort Int @default(9000)
@@ -27,6 +28,7 @@ model Setting {
ipv4 String?
ipv6 String?
arch String?
concurrentBuilds Int @default(1)
}
model User {
@@ -116,6 +118,24 @@ model Application {
settings ApplicationSettings?
secrets Secret[]
teams Team[]
connectedDatabase ApplicationConnectedDatabase?
}
model ApplicationConnectedDatabase {
id String @id @default(cuid())
applicationId String @unique
databaseId String?
hostedDatabaseType String?
hostedDatabaseHost String?
hostedDatabasePort Int?
hostedDatabaseName String?
hostedDatabaseUser String?
hostedDatabasePassword String?
hostedDatabaseDBName String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
database Database? @relation(fields: [databaseId], references: [id])
application Application @relation(fields: [applicationId], references: [id])
}
model ApplicationSettings {
@@ -127,6 +147,7 @@ model ApplicationSettings {
autodeploy Boolean @default(true)
isBot Boolean @default(false)
isPublicRepository Boolean @default(false)
isDBBranching Boolean @default(false)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
application Application @relation(fields: [applicationId], references: [id])
@@ -185,7 +206,7 @@ model BuildLog {
applicationId String?
buildId String
line String
time Int
time BigInt
}
model Build {
@@ -197,6 +218,9 @@ model Build {
githubAppId String?
gitlabAppId String?
commit String?
pullmergeRequestId String?
forceRebuild Boolean @default(false)
sourceBranch String?
branch String?
status String? @default("queued")
createdAt DateTime @default(now())
@@ -287,22 +311,36 @@ model GitlabApp {
}
model Database {
id String @id @default(cuid())
name String
publicPort Int?
defaultDatabase String?
type String?
version String?
dbUser String?
dbUserPassword String?
rootUser String?
rootUserPassword String?
destinationDockerId String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
destinationDocker DestinationDocker? @relation(fields: [destinationDockerId], references: [id])
settings DatabaseSettings?
teams Team[]
id String @id @default(cuid())
name String
publicPort Int?
defaultDatabase String?
type String?
version String?
dbUser String?
dbUserPassword String?
rootUser String?
rootUserPassword String?
destinationDockerId String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
destinationDocker DestinationDocker? @relation(fields: [destinationDockerId], references: [id])
settings DatabaseSettings?
teams Team[]
applicationConnectedDatabase ApplicationConnectedDatabase[]
databaseSecret DatabaseSecret[]
}
model DatabaseSecret {
id String @id @default(cuid())
name String
value String
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
databaseId String
database Database @relation(fields: [databaseId], references: [id])
@@unique([name, databaseId])
}
model DatabaseSettings {
@@ -327,23 +365,25 @@ model Service {
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
destinationDocker DestinationDocker? @relation(fields: [destinationDockerId], references: [id])
fider Fider?
ghost Ghost?
glitchTip GlitchTip?
hasura Hasura?
meiliSearch MeiliSearch?
minio Minio?
moodle Moodle?
plausibleAnalytics PlausibleAnalytics?
persistentStorage ServicePersistentStorage[]
serviceSecret ServiceSecret[]
umami Umami?
vscodeserver Vscodeserver?
wordpress Wordpress?
appwrite Appwrite?
teams Team[]
teams Team[]
fider Fider?
ghost Ghost?
glitchTip GlitchTip?
hasura Hasura?
meiliSearch MeiliSearch?
minio Minio?
moodle Moodle?
plausibleAnalytics PlausibleAnalytics?
umami Umami?
vscodeserver Vscodeserver?
wordpress Wordpress?
appwrite Appwrite?
searxng Searxng?
weblate Weblate?
taiga Taiga?
}
model PlausibleAnalytics {
@@ -545,3 +585,48 @@ model GlitchTip {
updatedAt DateTime @updatedAt
service Service @relation(fields: [serviceId], references: [id])
}
model Searxng {
id String @id @default(cuid())
secretKey String
redisPassword String
serviceId String @unique
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
service Service @relation(fields: [serviceId], references: [id])
}
model Weblate {
id String @id @default(cuid())
adminPassword String
postgresqlHost String
postgresqlPort Int
postgresqlUser String
postgresqlPassword String
postgresqlDatabase String
postgresqlPublicPort Int?
serviceId String @unique
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
service Service @relation(fields: [serviceId], references: [id])
}
model Taiga {
id String @id @default(cuid())
secretKey String
erlangSecret String
djangoAdminPassword String
djangoAdminUser String
rabbitMQUser String
rabbitMQPassword String
postgresqlHost String
postgresqlPort Int
postgresqlUser String
postgresqlPassword String
postgresqlDatabase String
postgresqlPublicPort Int?
serviceId String @unique
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
service Service @relation(fields: [serviceId], references: [id])
}

View File

@@ -17,7 +17,6 @@ const algorithm = 'aes-256-ctr';
async function main() {
// Enable registration for the first user
// Set initial HAProxy password
const settingsFound = await prisma.setting.findFirst({});
if (!settingsFound) {
await prisma.setting.create({
@@ -25,7 +24,8 @@ async function main() {
isRegistrationEnabled: true,
proxyPassword: encrypt(generatePassword()),
proxyUser: cuid(),
arch: process.arch
arch: process.arch,
DNSServers: '1.1.1.1,8.8.8.8'
}
});
} else {

View File

@@ -5,11 +5,10 @@ import env from '@fastify/env';
import cookie from '@fastify/cookie';
import path, { join } from 'path';
import autoLoad from '@fastify/autoload';
import { asyncExecShell, isDev, listSettings, prisma, version } from './lib/common';
import { asyncExecShell, createRemoteEngineConfiguration, getDomain, isDev, listSettings, prisma, version } from './lib/common';
import { scheduler } from './lib/scheduler';
import axios from 'axios';
import compareVersions from 'compare-versions';
import { compareVersions } from 'compare-versions';
import Graceful from '@ladjs/graceful'
declare module 'fastify' {
interface FastifyInstance {
config: {
@@ -27,129 +26,143 @@ declare module 'fastify' {
const port = isDev ? 3001 : 3000;
const host = '0.0.0.0';
const fastify = Fastify({
logger: false,
trustProxy: true
});
const schema = {
type: 'object',
required: ['COOLIFY_SECRET_KEY', 'COOLIFY_DATABASE_URL', 'COOLIFY_IS_ON'],
properties: {
COOLIFY_APP_ID: {
type: 'string',
},
COOLIFY_SECRET_KEY: {
type: 'string',
},
COOLIFY_DATABASE_URL: {
type: 'string',
default: 'file:../db/dev.db'
},
COOLIFY_SENTRY_DSN: {
type: 'string',
default: null
},
COOLIFY_IS_ON: {
type: 'string',
default: 'docker'
},
COOLIFY_WHITE_LABELED: {
type: 'string',
default: 'false'
},
COOLIFY_WHITE_LABELED_ICON: {
type: 'string',
default: null
},
COOLIFY_AUTO_UPDATE: {
type: 'string',
default: 'false'
},
}
};
const options = {
schema,
dotenv: true
};
fastify.register(env, options);
if (!isDev) {
fastify.register(serve, {
root: path.join(__dirname, './public'),
preCompressed: true
prisma.setting.findFirst().then(async (settings) => {
const fastify = Fastify({
logger: settings?.isAPIDebuggingEnabled || false,
trustProxy: true
});
fastify.setNotFoundHandler(async function (request, reply) {
if (request.raw.url && request.raw.url.startsWith('/api')) {
return reply.status(404).send({
success: false
});
const schema = {
type: 'object',
required: ['COOLIFY_SECRET_KEY', 'COOLIFY_DATABASE_URL', 'COOLIFY_IS_ON'],
properties: {
COOLIFY_APP_ID: {
type: 'string',
},
COOLIFY_SECRET_KEY: {
type: 'string',
},
COOLIFY_DATABASE_URL: {
type: 'string',
default: 'file:../db/dev.db'
},
COOLIFY_SENTRY_DSN: {
type: 'string',
default: null
},
COOLIFY_IS_ON: {
type: 'string',
default: 'docker'
},
COOLIFY_WHITE_LABELED: {
type: 'string',
default: 'false'
},
COOLIFY_WHITE_LABELED_ICON: {
type: 'string',
default: null
},
COOLIFY_AUTO_UPDATE: {
type: 'string',
default: 'false'
},
}
return reply.status(200).sendFile('index.html');
});
}
fastify.register(autoLoad, {
dir: join(__dirname, 'plugins')
});
fastify.register(autoLoad, {
dir: join(__dirname, 'routes')
});
};
fastify.register(cookie)
fastify.register(cors);
fastify.listen({ port, host }, async (err: any, address: any) => {
if (err) {
console.error(err);
process.exit(1);
}
console.log(`Coolify's API is listening on ${host}:${port}`);
await initServer();
await scheduler.start('deployApplication');
await scheduler.start('cleanupStorage');
await scheduler.start('cleanupPrismaEngines');
await scheduler.start('checkProxies');
// Check if no build is running
// Check for update
setInterval(async () => {
const { isAutoUpdateEnabled } = await prisma.setting.findFirst();
if (isAutoUpdateEnabled) {
const currentVersion = version;
const { data: versions } = await axios
.get(
`https://get.coollabs.io/versions.json`
, {
params: {
appId: process.env['COOLIFY_APP_ID'] || undefined,
version: currentVersion
}
})
const latestVersion = versions['coolify'].main.version;
const isUpdateAvailable = compareVersions(latestVersion, currentVersion);
if (isUpdateAvailable === 1) {
if (scheduler.workers.has('deployApplication')) {
scheduler.workers.get('deployApplication').postMessage("status:autoUpdater");
}
const options = {
schema,
dotenv: true
};
fastify.register(env, options);
if (!isDev) {
fastify.register(serve, {
root: path.join(__dirname, './public'),
preCompressed: true
});
fastify.setNotFoundHandler(async function (request, reply) {
if (request.raw.url && request.raw.url.startsWith('/api')) {
return reply.status(404).send({
success: false
});
}
}
}, isDev ? 5000 : 60000 * 15)
// Cleanup storage
setInterval(async () => {
if (scheduler.workers.has('deployApplication')) {
scheduler.workers.get('deployApplication').postMessage("status:cleanupStorage");
}
}, isDev ? 5000 : 60000 * 10)
scheduler.on('worker deleted', async (name) => {
if (name === 'autoUpdater' || name === 'cleanupStorage') {
if (!scheduler.workers.has('deployApplication')) await scheduler.start('deployApplication');
}
return reply.status(200).sendFile('index.html');
});
}
fastify.register(autoLoad, {
dir: join(__dirname, 'plugins')
});
await getArch();
await getIPAddress();
});
fastify.register(autoLoad, {
dir: join(__dirname, 'routes')
});
fastify.register(cookie)
fastify.register(cors);
fastify.addHook('onRequest', async (request, reply) => {
let allowedList = ['coolify:3000'];
const { ipv4, ipv6, fqdn } = await prisma.setting.findFirst({})
ipv4 && allowedList.push(`${ipv4}:3000`);
ipv6 && allowedList.push(ipv6);
fqdn && allowedList.push(getDomain(fqdn));
isDev && allowedList.push('localhost:3000') && allowedList.push('localhost:3001') && allowedList.push('host.docker.internal:3001');
const remotes = await prisma.destinationDocker.findMany({ where: { remoteEngine: true, remoteVerified: true } })
if (remotes.length > 0) {
remotes.forEach(remote => {
allowedList.push(`${remote.remoteIpAddress}:3000`);
})
}
if (!allowedList.includes(request.headers.host)) {
// console.log('not allowed', request.headers.host)
}
})
fastify.listen({ port, host }, async (err: any, address: any) => {
if (err) {
console.error(err);
process.exit(1);
}
console.log(`Coolify's API is listening on ${host}:${port}`);
await initServer();
const graceful = new Graceful({ brees: [scheduler] });
graceful.listen();
setInterval(async () => {
if (!scheduler.workers.has('deployApplication')) {
scheduler.run('deployApplication');
}
if (!scheduler.workers.has('infrastructure')) {
scheduler.run('infrastructure');
}
}, 2000)
// autoUpdater
setInterval(async () => {
scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:autoUpdater")
}, isDev ? 5000 : 60000 * 15)
// cleanupStorage
setInterval(async () => {
scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:cleanupStorage")
}, isDev ? 6000 : 60000 * 10)
// checkProxies
setInterval(async () => {
scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:checkProxies")
}, 10000)
// cleanupPrismaEngines
// setInterval(async () => {
// scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:cleanupPrismaEngines")
// }, 60000)
await Promise.all([
getArch(),
getIPAddress(),
configureRemoteDockers(),
])
});
})
async function getIPAddress() {
const { publicIpv4, publicIpv6 } = await import('public-ip')
try {
@@ -170,6 +183,12 @@ async function initServer() {
try {
await asyncExecShell(`docker network create --attachable coolify`);
} catch (error) { }
try {
const isOlder = compareVersions('3.8.1', version);
if (isOlder === 1) {
await prisma.build.updateMany({ where: { status: { in: ['running', 'queued'] } }, data: { status: 'failed' } });
}
} catch (error) { }
}
async function getArch() {
try {
@@ -180,4 +199,15 @@ async function getArch() {
} catch (error) { }
}
async function configureRemoteDockers() {
try {
const remoteDocker = await prisma.destinationDocker.findMany({
where: { remoteVerified: true, remoteEngine: true }
});
if (remoteDocker.length > 0) {
for (const docker of remoteDocker) {
await createRemoteEngineConfiguration(docker.id)
}
}
} catch (error) { }
}

View File

@@ -1,43 +0,0 @@
import axios from 'axios';
import compareVersions from 'compare-versions';
import { parentPort } from 'node:worker_threads';
import { asyncExecShell, asyncSleep, isDev, prisma, version } from '../lib/common';
(async () => {
if (parentPort) {
try {
const currentVersion = version;
const { data: versions } = await axios
.get(
`https://get.coollabs.io/versions.json`
, {
params: {
appId: process.env['COOLIFY_APP_ID'] || undefined,
version: currentVersion
}
})
const latestVersion = versions['coolify'].main.version;
const isUpdateAvailable = compareVersions(latestVersion, currentVersion);
if (isUpdateAvailable === 1) {
const activeCount = 0
if (activeCount === 0) {
if (!isDev) {
console.log(`Updating Coolify to ${latestVersion}.`);
await asyncExecShell(`docker pull coollabsio/coolify:${latestVersion}`);
await asyncExecShell(`env | grep COOLIFY > .env`);
await asyncExecShell(
`docker run --rm -tid --env-file .env -v /var/run/docker.sock:/var/run/docker.sock -v coolify-db coollabsio/coolify:${latestVersion} /bin/sh -c "env | grep COOLIFY > .env && echo 'TAG=${latestVersion}' >> .env && docker stop -t 0 coolify && docker rm coolify && docker compose up -d --force-recreate"`
);
} else {
console.log('Updating (not really in dev mode).');
}
}
}
} catch (error) {
console.log(error);
} finally {
await prisma.$disconnect();
}
} else process.exit(0);
})();

View File

@@ -1,96 +0,0 @@
import { parentPort } from 'node:worker_threads';
import { prisma, startTraefikTCPProxy, generateDatabaseConfiguration, startTraefikProxy, executeDockerCmd, listSettings } from '../lib/common';
import { checkContainer } from '../lib/docker';
(async () => {
if (parentPort) {
try {
const { arch } = await listSettings();
// Coolify Proxy local
const engine = '/var/run/docker.sock';
const localDocker = await prisma.destinationDocker.findFirst({
where: { engine, network: 'coolify' }
});
if (localDocker && localDocker.isCoolifyProxyUsed) {
// Remove HAProxy
const found = await checkContainer({ dockerId: localDocker.id, container: 'coolify-haproxy' });
if (found) {
await executeDockerCmd({
dockerId: localDocker.id,
command: `docker stop -t 0 coolify-haproxy && docker rm coolify-haproxy`
})
}
await startTraefikProxy(localDocker.id);
}
// TCP Proxies
const databasesWithPublicPort = await prisma.database.findMany({
where: { publicPort: { not: null } },
include: { settings: true, destinationDocker: true }
});
for (const database of databasesWithPublicPort) {
const { destinationDockerId, destinationDocker, publicPort, id } = database;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
const { privatePort } = generateDatabaseConfiguration(database, arch);
// Remove HAProxy
const found = await checkContainer({
dockerId: localDocker.id, container: `haproxy-for-${publicPort}`
});
if (found) {
await executeDockerCmd({
dockerId: localDocker.id,
command: `docker stop -t 0 haproxy-for-${publicPort} && docker rm haproxy-for-${publicPort}`
})
}
await startTraefikTCPProxy(destinationDocker, id, publicPort, privatePort);
}
}
const wordpressWithFtp = await prisma.wordpress.findMany({
where: { ftpPublicPort: { not: null } },
include: { service: { include: { destinationDocker: true } } }
});
for (const ftp of wordpressWithFtp) {
const { service, ftpPublicPort } = ftp;
const { destinationDockerId, destinationDocker, id } = service;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
// Remove HAProxy
const found = await checkContainer({ dockerId: localDocker.id, container: `haproxy-for-${ftpPublicPort}` });
if (found) {
await executeDockerCmd({
dockerId: localDocker.id,
command: `docker stop -t 0 haproxy -for-${ftpPublicPort} && docker rm haproxy-for-${ftpPublicPort}`
})
}
await startTraefikTCPProxy(destinationDocker, id, ftpPublicPort, 22, 'wordpressftp');
}
}
// HTTP Proxies
const minioInstances = await prisma.minio.findMany({
where: { publicPort: { not: null } },
include: { service: { include: { destinationDocker: true } } }
});
for (const minio of minioInstances) {
const { service, publicPort } = minio;
const { destinationDockerId, destinationDocker, id } = service;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
// Remove HAProxy
const found = await checkContainer({ dockerId: localDocker.id, container: `${id}-${publicPort}` });
if (found) {
await executeDockerCmd({
dockerId: localDocker.id,
command: `docker stop -t 0 ${id}-${publicPort} && docker rm ${id}-${publicPort} `
})
}
await startTraefikTCPProxy(destinationDocker, id, publicPort, 9000);
}
}
} catch (error) {
} finally {
await prisma.$disconnect();
}
} else process.exit(0);
})();

View File

@@ -1,19 +0,0 @@
import { parentPort } from 'node:worker_threads';
import { asyncExecShell, isDev, prisma } from '../lib/common';
(async () => {
if (parentPort) {
if (!isDev) {
try {
const { stdout } = await asyncExecShell(`ps -ef | grep /app/prisma-engines/query-engine | grep -v grep | wc -l | xargs`)
if (stdout.trim() != null && stdout.trim() != '' && Number(stdout.trim()) > 1) {
await asyncExecShell(`killall -q -e /app/prisma-engines/query-engine -o 10m`)
}
} catch (error) {
console.log(error);
} finally {
await prisma.$disconnect();
}
}
} else process.exit(0);
})();

View File

@@ -1,60 +0,0 @@
import { parentPort } from 'node:worker_threads';
import { asyncExecShell, cleanupDockerStorage, executeDockerCmd, isDev, prisma, version } from '../lib/common';
(async () => {
if (parentPort) {
const destinationDockers = await prisma.destinationDocker.findMany();
let enginesDone = new Set()
for (const destination of destinationDockers) {
if (enginesDone.has(destination.engine) || enginesDone.has(destination.remoteIpAddress)) return
if (destination.engine) enginesDone.add(destination.engine)
if (destination.remoteIpAddress) enginesDone.add(destination.remoteIpAddress)
let lowDiskSpace = false;
try {
let stdout = null
if (!isDev) {
const output = await executeDockerCmd({ dockerId: destination.id, command: `CONTAINER=$(docker ps -lq | head -1) && docker exec $CONTAINER sh -c 'df -kPT /'` })
stdout = output.stdout;
} else {
const output = await asyncExecShell(
`df -kPT /`
);
stdout = output.stdout;
}
let lines = stdout.trim().split('\n');
let header = lines[0];
let regex =
/^Filesystem\s+|Type\s+|1024-blocks|\s+Used|\s+Available|\s+Capacity|\s+Mounted on\s*$/g;
const boundaries = [];
let match;
while ((match = regex.exec(header))) {
boundaries.push(match[0].length);
}
boundaries[boundaries.length - 1] = -1;
const data = lines.slice(1).map((line) => {
const cl = boundaries.map((boundary) => {
const column = boundary > 0 ? line.slice(0, boundary) : line;
line = line.slice(boundary);
return column.trim();
});
return {
capacity: Number.parseInt(cl[5], 10) / 100
};
});
if (data.length > 0) {
const { capacity } = data[0];
if (capacity > 0.8) {
lowDiskSpace = true;
}
}
} catch (error) {
console.log(error);
}
await cleanupDockerStorage(destination.id, lowDiskSpace, false)
}
await prisma.$disconnect();
} else process.exit(0);
})();

View File

@@ -4,210 +4,277 @@ import fs from 'fs/promises';
import yaml from 'js-yaml';
import { copyBaseConfigurationFiles, makeLabelForStandaloneApplication, saveBuildLog, setDefaultConfiguration } from '../lib/buildPacks/common';
import { createDirectories, decrypt, defaultComposeConfiguration, executeDockerCmd, getDomain, prisma } from '../lib/common';
import { createDirectories, decrypt, defaultComposeConfiguration, executeDockerCmd, getDomain, prisma, decryptApplication } from '../lib/common';
import * as importers from '../lib/importers';
import * as buildpacks from '../lib/buildPacks';
(async () => {
if (parentPort) {
const concurrency = 1
const PQueue = await import('p-queue');
const queue = new PQueue.default({ concurrency });
parentPort.on('message', async (message) => {
if (parentPort) {
if (message === 'error') throw new Error('oops');
if (message === 'cancel') {
parentPort.postMessage('cancelled');
return;
}
if (message === 'status:autoUpdater') {
parentPort.postMessage({ size: queue.size, pending: queue.pending, caller: 'autoUpdater' });
return;
}
if (message === 'status:cleanupStorage') {
parentPort.postMessage({ size: queue.size, pending: queue.pending, caller: 'cleanupStorage' });
return;
}
if (message === 'error') throw new Error('oops');
if (message === 'cancel') {
parentPort.postMessage('cancelled');
await prisma.$disconnect()
process.exit(0);
}
});
const pThrottle = await import('p-throttle')
const throttle = pThrottle.default({
limit: 1,
interval: 2000
});
await queue.add(async () => {
const {
id: applicationId,
repository,
name,
destinationDocker,
destinationDockerId,
gitSource,
build_id: buildId,
configHash,
fqdn,
projectId,
secrets,
phpModules,
type,
pullmergeRequestId = null,
sourceBranch = null,
settings,
persistentStorage,
pythonWSGI,
pythonModule,
pythonVariable,
denoOptions,
exposePort,
baseImage,
baseBuildImage,
deploymentType,
forceRebuild
} = message
let {
branch,
buildPack,
port,
installCommand,
buildCommand,
startCommand,
baseDirectory,
publishDirectory,
dockerFileLocation,
denoMainFile
} = message
const currentHash = crypto
.createHash('sha256')
.update(
JSON.stringify({
pythonWSGI,
pythonModule,
pythonVariable,
deploymentType,
denoOptions,
baseImage,
baseBuildImage,
buildPack,
port,
exposePort,
installCommand,
buildCommand,
startCommand,
secrets,
branch,
repository,
fqdn
})
)
.digest('hex');
try {
const { debug } = settings;
if (concurrency === 1) {
await prisma.build.updateMany({
where: {
status: { in: ['queued', 'running'] },
id: { not: buildId },
applicationId,
createdAt: { lt: new Date(new Date().getTime() - 10 * 1000) }
},
data: { status: 'failed' }
});
}
let imageId = applicationId;
let domain = getDomain(fqdn);
const volumes =
persistentStorage?.map((storage) => {
return `${applicationId}${storage.path.replace(/\//gi, '-')}:${buildPack !== 'docker' ? '/app' : ''
}${storage.path}`;
}) || [];
// Previews, we need to get the source branch and set subdomain
if (pullmergeRequestId) {
branch = sourceBranch;
domain = `${pullmergeRequestId}.${domain}`;
imageId = `${applicationId}-${pullmergeRequestId}`;
}
let deployNeeded = true;
let destinationType;
if (destinationDockerId) {
destinationType = 'docker';
}
if (destinationType === 'docker') {
await prisma.build.update({ where: { id: buildId }, data: { status: 'running' } });
const { workdir, repodir } = await createDirectories({ repository, buildId });
const configuration = await setDefaultConfiguration(message);
buildPack = configuration.buildPack;
port = configuration.port;
installCommand = configuration.installCommand;
startCommand = configuration.startCommand;
buildCommand = configuration.buildCommand;
publishDirectory = configuration.publishDirectory;
baseDirectory = configuration.baseDirectory;
dockerFileLocation = configuration.dockerFileLocation;
denoMainFile = configuration.denoMainFile;
const commit = await importers[gitSource.type]({
applicationId,
debug,
workdir,
repodir,
githubAppId: gitSource.githubApp?.id,
gitlabAppId: gitSource.gitlabApp?.id,
customPort: gitSource.customPort,
repository,
branch,
buildId,
apiUrl: gitSource.apiUrl,
htmlUrl: gitSource.htmlUrl,
projectId,
deployKeyId: gitSource.gitlabApp?.deployKeyId || null,
privateSshKey: decrypt(gitSource.gitlabApp?.privateSshKey) || null,
forPublic: gitSource.forPublic
});
if (!commit) {
throw new Error('No commit found?');
}
let tag = commit.slice(0, 7);
if (pullmergeRequestId) {
tag = `${commit.slice(0, 7)}-${pullmergeRequestId}`;
}
const th = throttle(async () => {
try {
const queuedBuilds = await prisma.build.findMany({ where: { status: { in: ['queued', 'running'] } }, orderBy: { createdAt: 'asc' } });
const { concurrentBuilds } = await prisma.setting.findFirst({})
if (queuedBuilds.length > 0) {
parentPort.postMessage({ deploying: true });
const concurrency = concurrentBuilds;
const pAll = await import('p-all');
const actions = []
for (const queueBuild of queuedBuilds) {
actions.push(async () => {
let application = await prisma.application.findUnique({ where: { id: queueBuild.applicationId }, include: { destinationDocker: true, gitSource: { include: { githubApp: true, gitlabApp: true } }, persistentStorage: true, secrets: true, settings: true, teams: true } })
let { id: buildId, type, sourceBranch = null, pullmergeRequestId = null, forceRebuild } = queueBuild
application = decryptApplication(application)
try {
await prisma.build.update({ where: { id: buildId }, data: { commit } });
} catch (err) {
console.log(err);
}
if (!pullmergeRequestId) {
if (configHash !== currentHash) {
deployNeeded = true;
if (configHash) {
await saveBuildLog({ line: 'Configuration changed.', buildId, applicationId });
}
} else {
deployNeeded = false;
if (queueBuild.status === 'running') {
await saveBuildLog({ line: 'Building halted, restarting...', buildId, applicationId: application.id });
}
const {
id: applicationId,
repository,
name,
destinationDocker,
destinationDockerId,
gitSource,
configHash,
fqdn,
projectId,
secrets,
phpModules,
settings,
persistentStorage,
pythonWSGI,
pythonModule,
pythonVariable,
denoOptions,
exposePort,
baseImage,
baseBuildImage,
deploymentType,
} = application
let {
branch,
buildPack,
port,
installCommand,
buildCommand,
startCommand,
baseDirectory,
publishDirectory,
dockerFileLocation,
denoMainFile
} = application
const currentHash = crypto
.createHash('sha256')
.update(
JSON.stringify({
pythonWSGI,
pythonModule,
pythonVariable,
deploymentType,
denoOptions,
baseImage,
baseBuildImage,
buildPack,
port,
exposePort,
installCommand,
buildCommand,
startCommand,
secrets,
branch,
repository,
fqdn
})
)
.digest('hex');
const { debug } = settings;
if (concurrency === 1) {
await prisma.build.updateMany({
where: {
status: { in: ['queued', 'running'] },
id: { not: buildId },
applicationId,
createdAt: { lt: new Date(new Date().getTime() - 10 * 1000) }
},
data: { status: 'failed' }
});
}
let imageId = applicationId;
let domain = getDomain(fqdn);
const volumes =
persistentStorage?.map((storage) => {
return `${applicationId}${storage.path.replace(/\//gi, '-')}:${buildPack !== 'docker' ? '/app' : ''
}${storage.path}`;
}) || [];
// Previews, we need to get the source branch and set subdomain
if (pullmergeRequestId) {
branch = sourceBranch;
domain = `${pullmergeRequestId}.${domain}`;
imageId = `${applicationId}-${pullmergeRequestId}`;
}
} else {
deployNeeded = true;
}
let imageFound = false;
try {
await executeDockerCmd({
dockerId: destinationDocker.id,
command: `docker image inspect ${applicationId}:${tag}`
})
imageFound = true;
} catch (error) {
//
}
await copyBaseConfigurationFiles(buildPack, workdir, buildId, applicationId, baseImage);
let deployNeeded = true;
let destinationType;
if (forceRebuild) deployNeeded = true
if (!imageFound || deployNeeded) {
// if (true) {
if (buildpacks[buildPack])
await buildpacks[buildPack]({
dockerId: destinationDocker.id,
buildId,
if (destinationDockerId) {
destinationType = 'docker';
}
if (destinationType === 'docker') {
await prisma.build.update({ where: { id: buildId }, data: { status: 'running' } });
const { workdir, repodir } = await createDirectories({ repository, buildId });
const configuration = await setDefaultConfiguration(application);
buildPack = configuration.buildPack;
port = configuration.port;
installCommand = configuration.installCommand;
startCommand = configuration.startCommand;
buildCommand = configuration.buildCommand;
publishDirectory = configuration.publishDirectory;
baseDirectory = configuration.baseDirectory;
dockerFileLocation = configuration.dockerFileLocation;
denoMainFile = configuration.denoMainFile;
const commit = await importers[gitSource.type]({
applicationId,
domain,
debug,
workdir,
repodir,
githubAppId: gitSource.githubApp?.id,
gitlabAppId: gitSource.gitlabApp?.id,
customPort: gitSource.customPort,
repository,
branch,
buildId,
apiUrl: gitSource.apiUrl,
htmlUrl: gitSource.htmlUrl,
projectId,
deployKeyId: gitSource.gitlabApp?.deployKeyId || null,
privateSshKey: decrypt(gitSource.gitlabApp?.privateSshKey) || null,
forPublic: gitSource.forPublic
});
if (!commit) {
throw new Error('No commit found?');
}
let tag = commit.slice(0, 7);
if (pullmergeRequestId) {
tag = `${commit.slice(0, 7)}-${pullmergeRequestId}`;
}
try {
await prisma.build.update({ where: { id: buildId }, data: { commit } });
} catch (err) { }
if (!pullmergeRequestId) {
if (configHash !== currentHash) {
deployNeeded = true;
if (configHash) {
await saveBuildLog({ line: 'Configuration changed.', buildId, applicationId });
}
} else {
deployNeeded = false;
}
} else {
deployNeeded = true;
}
let imageFound = false;
try {
await executeDockerCmd({
dockerId: destinationDocker.id,
command: `docker image inspect ${applicationId}:${tag}`
})
imageFound = true;
} catch (error) {
//
}
await copyBaseConfigurationFiles(buildPack, workdir, buildId, applicationId, baseImage);
if (forceRebuild) deployNeeded = true
if (!imageFound || deployNeeded) {
// if (true) {
if (buildpacks[buildPack])
await buildpacks[buildPack]({
dockerId: destinationDocker.id,
buildId,
applicationId,
domain,
name,
type,
pullmergeRequestId,
buildPack,
repository,
branch,
projectId,
publishDirectory,
debug,
commit,
tag,
workdir,
port: exposePort ? `${exposePort}:${port}` : port,
installCommand,
buildCommand,
startCommand,
baseDirectory,
secrets,
phpModules,
pythonWSGI,
pythonModule,
pythonVariable,
dockerFileLocation,
denoMainFile,
denoOptions,
baseImage,
baseBuildImage,
deploymentType
});
else {
await saveBuildLog({ line: `Build pack ${buildPack} not found`, buildId, applicationId });
throw new Error(`Build pack ${buildPack} not found.`);
}
} else {
await saveBuildLog({ line: 'Build image already available - no rebuild required.', buildId, applicationId });
}
try {
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker stop -t 0 ${imageId}` })
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker rm ${imageId}` })
} catch (error) {
//
}
const envs = [
`PORT=${port}`
];
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
if (!secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
}
});
}
await fs.writeFile(`${workdir}/.env`, envs.join('\n'));
const labels = makeLabelForStandaloneApplication({
applicationId,
fqdn,
name,
type,
pullmergeRequestId,
@@ -215,148 +282,90 @@ import * as buildpacks from '../lib/buildPacks';
repository,
branch,
projectId,
publishDirectory,
debug,
commit,
tag,
workdir,
port: exposePort ? `${exposePort}:${port}` : port,
commit,
installCommand,
buildCommand,
startCommand,
baseDirectory,
secrets,
phpModules,
pythonWSGI,
pythonModule,
pythonVariable,
dockerFileLocation,
denoMainFile,
denoOptions,
baseImage,
baseBuildImage,
deploymentType
publishDirectory
});
else {
await saveBuildLog({ line: `Build pack ${buildPack} not found`, buildId, applicationId });
throw new Error(`Build pack ${buildPack} not found.`);
}
} else {
await saveBuildLog({ line: 'Build image already available - no rebuild required.', buildId, applicationId });
}
try {
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker stop -t 0 ${imageId}` })
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker rm ${imageId}` })
} catch (error) {
//
}
const envs = [
`PORT=${port}`
];
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
if (!secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
let envFound = false;
try {
envFound = !!(await fs.stat(`${workdir}/.env`));
} catch (error) {
//
}
});
try {
await saveBuildLog({ line: 'Deployment started.', buildId, applicationId });
const composeVolumes = volumes.map((volume) => {
return {
[`${volume.split(':')[0]}`]: {
name: volume.split(':')[0]
}
};
});
const composeFile = {
version: '3.8',
services: {
[imageId]: {
image: `${applicationId}:${tag}`,
container_name: imageId,
volumes,
env_file: envFound ? [`${workdir}/.env`] : [],
labels,
depends_on: [],
expose: [port],
...(exposePort ? { ports: [`${exposePort}:${port}`] } : {}),
// logging: {
// driver: 'fluentd',
// },
...defaultComposeConfiguration(destinationDocker.network),
}
},
networks: {
[destinationDocker.network]: {
external: true
}
},
volumes: Object.assign({}, ...composeVolumes)
};
await fs.writeFile(`${workdir}/docker-compose.yml`, yaml.dump(composeFile));
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker compose --project-directory ${workdir} up -d` })
await saveBuildLog({ line: 'Deployment successful!', buildId, applicationId });
} catch (error) {
await saveBuildLog({ line: error, buildId, applicationId });
await prisma.build.updateMany({
where: { id: buildId, status: { in: ['queued', 'running'] } },
data: { status: 'failed' }
});
throw new Error(error);
}
await saveBuildLog({ line: 'Proxy will be updated shortly.', buildId, applicationId });
await prisma.build.update({ where: { id: buildId }, data: { status: 'success' } });
if (!pullmergeRequestId) await prisma.application.update({
where: { id: applicationId },
data: { configHash: currentHash }
});
}
}
await fs.writeFile(`${workdir}/.env`, envs.join('\n'));
const labels = makeLabelForStandaloneApplication({
applicationId,
fqdn,
name,
type,
pullmergeRequestId,
buildPack,
repository,
branch,
projectId,
port: exposePort ? `${exposePort}:${port}` : port,
commit,
installCommand,
buildCommand,
startCommand,
baseDirectory,
publishDirectory
});
let envFound = false;
try {
envFound = !!(await fs.stat(`${workdir}/.env`));
} catch (error) {
//
}
try {
await saveBuildLog({ line: 'Deployment started.', buildId, applicationId });
const composeVolumes = volumes.map((volume) => {
return {
[`${volume.split(':')[0]}`]: {
name: volume.split(':')[0]
}
};
});
const composeFile = {
version: '3.8',
services: {
[imageId]: {
image: `${applicationId}:${tag}`,
container_name: imageId,
volumes,
env_file: envFound ? [`${workdir}/.env`] : [],
labels,
depends_on: [],
expose: [port],
...(exposePort ? { ports: [`${exposePort}:${port}`] } : {}),
// logging: {
// driver: 'fluentd',
// },
...defaultComposeConfiguration(destinationDocker.network),
}
},
networks: {
[destinationDocker.network]: {
external: true
}
},
volumes: Object.assign({}, ...composeVolumes)
};
await fs.writeFile(`${workdir}/docker-compose.yml`, yaml.dump(composeFile));
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker compose --project-directory ${workdir} up -d` })
await saveBuildLog({ line: 'Deployment successful!', buildId, applicationId });
} catch (error) {
await saveBuildLog({ line: error, buildId, applicationId });
await prisma.build.update({
where: { id: message.build_id },
catch (error) {
await prisma.build.updateMany({
where: { id: buildId, status: { in: ['queued', 'running'] } },
data: { status: 'failed' }
});
throw new Error(error);
await saveBuildLog({ line: error, buildId, applicationId: application.id });
}
await saveBuildLog({ line: 'Proxy will be updated shortly.', buildId, applicationId });
await prisma.build.update({ where: { id: message.build_id }, data: { status: 'success' } });
if (!pullmergeRequestId) await prisma.application.update({
where: { id: applicationId },
data: { configHash: currentHash }
});
}
}
catch (error) {
await prisma.build.update({
where: { id: message.build_id },
data: { status: 'failed' }
});
await saveBuildLog({ line: error, buildId, applicationId });
} finally {
await prisma.$disconnect();
}
});
await prisma.$disconnect();
await pAll.default(actions, { concurrency })
}
} catch (error) {
console.log(error)
}
});
})
while (true) {
await th()
}
} else process.exit(0);
})();

View File

@@ -0,0 +1,229 @@
import { parentPort } from 'node:worker_threads';
import axios from 'axios';
import { compareVersions } from 'compare-versions';
import { asyncExecShell, cleanupDockerStorage, executeDockerCmd, isDev, prisma, startTraefikTCPProxy, generateDatabaseConfiguration, startTraefikProxy, listSettings, version, createRemoteEngineConfiguration } from '../lib/common';
async function autoUpdater() {
try {
const currentVersion = version;
const { data: versions } = await axios
.get(
`https://get.coollabs.io/versions.json`
, {
params: {
appId: process.env['COOLIFY_APP_ID'] || undefined,
version: currentVersion
}
})
const latestVersion = versions['coolify'].main.version;
const isUpdateAvailable = compareVersions(latestVersion, currentVersion);
if (isUpdateAvailable === 1) {
const activeCount = 0
if (activeCount === 0) {
if (!isDev) {
const { isAutoUpdateEnabled } = await prisma.setting.findFirst();
if (isAutoUpdateEnabled) {
await asyncExecShell(`docker pull coollabsio/coolify:${latestVersion}`);
await asyncExecShell(`env | grep COOLIFY > .env`);
await asyncExecShell(
`sed -i '/COOLIFY_AUTO_UPDATE=/cCOOLIFY_AUTO_UPDATE=${isAutoUpdateEnabled}' .env`
);
await asyncExecShell(
`docker run --rm -tid --env-file .env -v /var/run/docker.sock:/var/run/docker.sock -v coolify-db coollabsio/coolify:${latestVersion} /bin/sh -c "env | grep COOLIFY > .env && echo 'TAG=${latestVersion}' >> .env && docker stop -t 0 coolify && docker rm coolify && docker compose up -d --force-recreate"`
);
}
} else {
console.log('Updating (not really in dev mode).');
}
}
}
} catch (error) { }
}
async function checkProxies() {
try {
const { default: isReachable } = await import('is-port-reachable');
let portReachable;
const { arch, ipv4, ipv6 } = await listSettings();
// Coolify Proxy local
const engine = '/var/run/docker.sock';
const localDocker = await prisma.destinationDocker.findFirst({
where: { engine, network: 'coolify', isCoolifyProxyUsed: true }
});
if (localDocker) {
portReachable = await isReachable(80, { host: ipv4 || ipv6 })
if (!portReachable) {
await startTraefikProxy(localDocker.id);
}
}
// Coolify Proxy remote
const remoteDocker = await prisma.destinationDocker.findMany({
where: { remoteEngine: true, remoteVerified: true }
});
if (remoteDocker.length > 0) {
for (const docker of remoteDocker) {
if (docker.isCoolifyProxyUsed) {
portReachable = await isReachable(80, { host: docker.remoteIpAddress })
if (!portReachable) {
await startTraefikProxy(docker.id);
}
}
try {
await createRemoteEngineConfiguration(docker.id)
} catch (error) { }
}
}
// TCP Proxies
const databasesWithPublicPort = await prisma.database.findMany({
where: { publicPort: { not: null } },
include: { settings: true, destinationDocker: true }
});
for (const database of databasesWithPublicPort) {
const { destinationDockerId, destinationDocker, publicPort, id } = database;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
const { privatePort } = generateDatabaseConfiguration(database, arch);
portReachable = await isReachable(publicPort, { host: destinationDocker.remoteIpAddress || ipv4 || ipv6 })
if (!portReachable) {
await startTraefikTCPProxy(destinationDocker, id, publicPort, privatePort);
}
}
}
const wordpressWithFtp = await prisma.wordpress.findMany({
where: { ftpPublicPort: { not: null } },
include: { service: { include: { destinationDocker: true } } }
});
for (const ftp of wordpressWithFtp) {
const { service, ftpPublicPort } = ftp;
const { destinationDockerId, destinationDocker, id } = service;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
portReachable = await isReachable(ftpPublicPort, { host: destinationDocker.remoteIpAddress || ipv4 || ipv6 })
if (!portReachable) {
await startTraefikTCPProxy(destinationDocker, id, ftpPublicPort, 22, 'wordpressftp');
}
}
}
// HTTP Proxies
const minioInstances = await prisma.minio.findMany({
where: { publicPort: { not: null } },
include: { service: { include: { destinationDocker: true } } }
});
for (const minio of minioInstances) {
const { service, publicPort } = minio;
const { destinationDockerId, destinationDocker, id } = service;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
portReachable = await isReachable(publicPort, { host: destinationDocker.remoteIpAddress || ipv4 || ipv6 })
if (!portReachable) {
await startTraefikTCPProxy(destinationDocker, id, publicPort, 9000);
}
}
}
} catch (error) {
}
}
async function cleanupPrismaEngines() {
if (!isDev) {
try {
const { stdout } = await asyncExecShell(`ps -ef | grep /app/prisma-engines/query-engine | grep -v grep | wc -l | xargs`)
if (stdout.trim() != null && stdout.trim() != '' && Number(stdout.trim()) > 1) {
await asyncExecShell(`killall -q -e /app/prisma-engines/query-engine -o 1m`)
}
} catch (error) { }
}
}
async function cleanupStorage() {
const destinationDockers = await prisma.destinationDocker.findMany();
let enginesDone = new Set()
for (const destination of destinationDockers) {
if (enginesDone.has(destination.engine) || enginesDone.has(destination.remoteIpAddress)) return
if (destination.engine) enginesDone.add(destination.engine)
if (destination.remoteIpAddress) enginesDone.add(destination.remoteIpAddress)
let lowDiskSpace = false;
try {
let stdout = null
if (!isDev) {
const output = await executeDockerCmd({ dockerId: destination.id, command: `CONTAINER=$(docker ps -lq | head -1) && docker exec $CONTAINER sh -c 'df -kPT /'` })
stdout = output.stdout;
} else {
const output = await asyncExecShell(
`df -kPT /`
);
stdout = output.stdout;
}
let lines = stdout.trim().split('\n');
let header = lines[0];
let regex =
/^Filesystem\s+|Type\s+|1024-blocks|\s+Used|\s+Available|\s+Capacity|\s+Mounted on\s*$/g;
const boundaries = [];
let match;
while ((match = regex.exec(header))) {
boundaries.push(match[0].length);
}
boundaries[boundaries.length - 1] = -1;
const data = lines.slice(1).map((line) => {
const cl = boundaries.map((boundary) => {
const column = boundary > 0 ? line.slice(0, boundary) : line;
line = line.slice(boundary);
return column.trim();
});
return {
capacity: Number.parseInt(cl[5], 10) / 100
};
});
if (data.length > 0) {
const { capacity } = data[0];
if (capacity > 0.8) {
lowDiskSpace = true;
}
}
} catch (error) { }
await cleanupDockerStorage(destination.id, lowDiskSpace, false)
}
}
(async () => {
let status = {
cleanupStorage: false,
autoUpdater: false
}
if (parentPort) {
parentPort.on('message', async (message) => {
if (parentPort) {
if (message === 'error') throw new Error('oops');
if (message === 'cancel') {
parentPort.postMessage('cancelled');
process.exit(1);
}
if (message === 'action:cleanupStorage') {
if (!status.autoUpdater) {
status.cleanupStorage = true
await cleanupStorage();
status.cleanupStorage = false
}
return;
}
if (message === 'action:cleanupPrismaEngines') {
await cleanupPrismaEngines();
return;
}
if (message === 'action:checkProxies') {
await checkProxies();
return;
}
if (message === 'action:autoUpdater') {
if (!status.cleanupStorage) {
status.autoUpdater = true
await autoUpdater();
status.autoUpdater = false
}
return;
}
}
});
} else process.exit(0);
})();

View File

@@ -89,6 +89,22 @@ export function setDefaultBaseImage(buildPack: string | null, deploymentType: st
}
];
const phpVersions = [
{
value: 'webdevops/php-apache:8.2',
label: 'webdevops/php-apache:8.2'
},
{
value: 'webdevops/php-nginx:8.2',
label: 'webdevops/php-nginx:8.2'
},
{
value: 'webdevops/php-apache:8.1',
label: 'webdevops/php-apache:8.1'
},
{
value: 'webdevops/php-nginx:8.1',
label: 'webdevops/php-nginx:8.1'
},
{
value: 'webdevops/php-apache:8.0',
label: 'webdevops/php-apache:8.0'
@@ -145,6 +161,22 @@ export function setDefaultBaseImage(buildPack: string | null, deploymentType: st
value: 'webdevops/php-nginx:5.6',
label: 'webdevops/php-nginx:5.6'
},
{
value: 'webdevops/php-apache:8.2-alpine',
label: 'webdevops/php-apache:8.2-alpine'
},
{
value: 'webdevops/php-nginx:8.2-alpine',
label: 'webdevops/php-nginx:8.2-alpine'
},
{
value: 'webdevops/php-apache:8.1-alpine',
label: 'webdevops/php-apache:8.1-alpine'
},
{
value: 'webdevops/php-nginx:8.1-alpine',
label: 'webdevops/php-nginx:8.1-alpine'
},
{
value: 'webdevops/php-apache:8.0-alpine',
label: 'webdevops/php-apache:8.0-alpine'
@@ -305,11 +337,11 @@ export function setDefaultBaseImage(buildPack: string | null, deploymentType: st
payload.baseImage = 'denoland/deno:latest';
}
if (buildPack === 'php') {
payload.baseImage = 'webdevops/php-apache:8.0-alpine';
payload.baseImage = 'webdevops/php-apache:8.2-alpine';
payload.baseImages = phpVersions;
}
if (buildPack === 'laravel') {
payload.baseImage = 'webdevops/php-apache:8.0-alpine';
payload.baseImage = 'webdevops/php-apache:8.2-alpine';
payload.baseBuildImage = 'node:18';
payload.baseBuildImages = nodeVersions;
}
@@ -512,7 +544,6 @@ export async function copyBaseConfigurationFiles(
);
}
} catch (error) {
console.log(error);
throw new Error(error);
}
}
@@ -541,9 +572,6 @@ export async function buildImage({
} else {
await saveBuildLog({ line: `Building image started.`, buildId, applicationId });
}
if (debug) {
await saveBuildLog({ line: `\n###############\nIMPORTANT: Due to some issues during implementing Remote Docker Engine, the builds logs are not streamed at the moment - but will be soon! You will see the full build log when the build is finished!\n###############`, buildId, applicationId });
}
if (!debug && isCache) {
await saveBuildLog({
line: `Debug turned off. To see more details, allow it in the configuration.`,
@@ -553,54 +581,11 @@ export async function buildImage({
}
const dockerFile = isCache ? `${dockerFileLocation}-cache` : `${dockerFileLocation}`
const cache = `${applicationId}:${tag}${isCache ? '-cache' : ''}`
const { stderr } = await executeDockerCmd({ dockerId, command: `docker build --progress plain -f ${workdir}/${dockerFile} -t ${cache} ${workdir}` })
if (debug) {
const array = stderr.split('\n')
for (const line of array) {
if (line !== '\n') {
await saveBuildLog({
line: `${line.replace('\n', '')}`,
buildId,
applicationId
});
}
}
await executeDockerCmd({ debug, buildId, applicationId, dockerId, command: `docker build --progress plain -f ${workdir}/${dockerFile} -t ${cache} ${workdir}` })
const { status } = await prisma.build.findUnique({ where: { id: buildId } })
if (status === 'canceled') {
throw new Error('Deployment canceled.')
}
// await new Promise((resolve, reject) => {
// const command = spawn(`docker`, ['build', '-f', `${workdir}${dockerFile}`, '-t', `${cache}`,`${workdir}`], {
// env: {
// DOCKER_HOST: 'ssh://root@95.217.178.202',
// DOCKER_BUILDKIT: '1'
// }
// });
// command.stdout.on('data', function (data) {
// console.log('stdout: ' + data);
// });
// command.stderr.on('data', function (data) {
// console.log('stderr: ' + data);
// });
// command.on('error', function (error) {
// console.log(error)
// reject(error)
// })
// command.on('exit', function (code) {
// console.log('exit code: ' + code);
// resolve(code)
// });
// })
// console.log({ stdout, stderr })
// const stream = await docker.engine.buildImage(
// { src: ['.'], context: workdir },
// {
// dockerfile: isCache ? `${dockerFileLocation}-cache` : dockerFileLocation,
// t: `${applicationId}:${tag}${isCache ? '-cache' : ''}`
// }
// );
// await streamEvents({ stream, docker, buildId, applicationId, debug });
if (isCache) {
await saveBuildLog({ line: `Building cache image successful.`, buildId, applicationId });
} else {
@@ -717,11 +702,10 @@ export async function buildCacheImageWithNode(data, imageForBuild) {
if (isPnpm) {
Dockerfile.push('RUN curl -f https://get.pnpm.io/v6.16.js | node - add --global pnpm@7');
}
Dockerfile.push(`COPY .${baseDirectory || ''} ./`);
if (installCommand) {
Dockerfile.push(`COPY .${baseDirectory || ''}/package.json ./`);
Dockerfile.push(`RUN ${installCommand}`);
}
Dockerfile.push(`COPY .${baseDirectory || ''} ./`);
Dockerfile.push(`RUN ${buildCommand}`);
await fs.writeFile(`${workdir}/Dockerfile-cache`, Dockerfile.join('\n'));
await buildImage({ ...data, isCache: true });
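The reordering in buildCacheImageWithNode above copies package.json and runs the install command before copying the rest of the source, so the dependency layer stays cached between builds. Illustrative only, assuming baseDirectory is unset, installCommand is 'npm install' and buildCommand is 'npm run build'; the generated Dockerfile-cache would then read roughly:

// Assumed output of the push() calls above for a plain npm project; the
// FROM/WORKDIR lines come from earlier in the function and are not shown in this hunk.
const exampleDockerfileCache = [
  'FROM node:18',
  'WORKDIR /app',
  'COPY ./package.json ./',  // dependencies first: cached until package.json changes
  'RUN npm install',
  'COPY . ./',               // source last: edits no longer invalidate the install layer
  'RUN npm run build'
].join('\n');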

File diff suppressed because it is too large

View File

@@ -76,7 +76,6 @@ export async function removeContainer({
await executeDockerCmd({ dockerId, command: `docker rm ${id}` })
}
} catch (error) {
console.log(error);
throw error;
}
}

View File

@@ -2,54 +2,29 @@ import Bree from 'bree';
import path from 'path';
import Cabin from 'cabin';
import TSBree from '@breejs/ts-worker';
import { isDev } from './common';
export const isDev = process.env.NODE_ENV === 'development';
Bree.extend(TSBree);
const options: any = {
defaultExtension: 'js',
// logger: new Cabin(),
logger: false,
workerMessageHandler: async ({ name, message }) => {
if (name === 'deployApplication') {
if (message.pending === 0 && message.size === 0) {
if (message.caller === 'autoUpdater') {
if (!scheduler.workers.has('autoUpdater')) {
await scheduler.stop('deployApplication');
await scheduler.run('autoUpdater')
}
}
if (message.caller === 'cleanupStorage') {
if (!scheduler.workers.has('cleanupStorage')) {
await scheduler.run('cleanupStorage')
}
}
if (name === 'deployApplication' && message?.deploying) {
if (scheduler.workers.has('autoUpdater') || scheduler.workers.has('cleanupStorage')) {
scheduler.workers.get('deployApplication').postMessage('cancel')
}
}
},
jobs: [
{
name: 'deployApplication'
},
{
name: 'cleanupStorage',
},
{
name: 'cleanupPrismaEngines',
interval: '1m'
},
{
name: 'checkProxies',
interval: '10s'
},
{
name: 'autoUpdater',
}
{ name: 'infrastructure' },
{ name: 'deployApplication' },
],
};
if (isDev) options.root = path.join(__dirname, '../jobs');
export const scheduler = new Bree(options);
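With the job list collapsed to 'infrastructure' and 'deployApplication', the interval-based jobs (checkProxies, cleanupPrismaEngines, autoUpdater, cleanupStorage) are presumably driven by 'action:*' messages instead of Bree intervals. A minimal bootstrap sketch, assuming the API server starts the long-running job on boot:

// Sketch, not the actual server bootstrap: start the maintenance worker and
// let it receive 'action:*' messages (see the worker earlier in this diff).
import { scheduler } from './lib/scheduler';

export async function startJobs(): Promise<void> {
  await scheduler.start('infrastructure');
  // 'deployApplication' is run on demand when a build is queued.
}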

View File

@@ -0,0 +1,383 @@
import cuid from 'cuid';
import { encrypt, generatePassword, prisma } from '../common';
export const includeServices: any = {
destinationDocker: true,
persistentStorage: true,
serviceSecret: true,
minio: true,
plausibleAnalytics: true,
vscodeserver: true,
wordpress: true,
ghost: true,
meiliSearch: true,
umami: true,
hasura: true,
fider: true,
moodle: true,
appwrite: true,
glitchTip: true,
searxng: true,
weblate: true,
taiga: true
};
export async function configureServiceType({
id,
type
}: {
id: string;
type: string;
}): Promise<void> {
if (type === 'plausibleanalytics') {
const password = encrypt(generatePassword({}));
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'plausibleanalytics';
const secretKeyBase = encrypt(generatePassword({ length: 64 }));
await prisma.service.update({
where: { id },
data: {
type,
plausibleAnalytics: {
create: {
postgresqlDatabase,
postgresqlUser,
postgresqlPassword,
password,
secretKeyBase
}
}
}
});
} else if (type === 'nocodb') {
await prisma.service.update({
where: { id },
data: { type }
});
} else if (type === 'minio') {
const rootUser = cuid();
const rootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: { type, minio: { create: { rootUser, rootUserPassword } } }
});
} else if (type === 'vscodeserver') {
const password = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: { type, vscodeserver: { create: { password } } }
});
} else if (type === 'wordpress') {
const mysqlUser = cuid();
const mysqlPassword = encrypt(generatePassword({}));
const mysqlRootUser = cuid();
const mysqlRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
wordpress: { create: { mysqlPassword, mysqlRootUserPassword, mysqlRootUser, mysqlUser } }
}
});
} else if (type === 'vaultwarden') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'languagetool') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'n8n') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'uptimekuma') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'ghost') {
const defaultEmail = `${cuid()}@example.com`;
const defaultPassword = encrypt(generatePassword({}));
const mariadbUser = cuid();
const mariadbPassword = encrypt(generatePassword({}));
const mariadbRootUser = cuid();
const mariadbRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
ghost: {
create: {
defaultEmail,
defaultPassword,
mariadbUser,
mariadbPassword,
mariadbRootUser,
mariadbRootUserPassword
}
}
}
});
} else if (type === 'meilisearch') {
const masterKey = encrypt(generatePassword({ length: 32 }));
await prisma.service.update({
where: { id },
data: {
type,
meiliSearch: { create: { masterKey } }
}
});
} else if (type === 'umami') {
const umamiAdminPassword = encrypt(generatePassword({}));
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'umami';
const hashSalt = encrypt(generatePassword({ length: 64 }));
await prisma.service.update({
where: { id },
data: {
type,
umami: {
create: {
umamiAdminPassword,
postgresqlDatabase,
postgresqlPassword,
postgresqlUser,
hashSalt
}
}
}
});
} else if (type === 'hasura') {
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'hasura';
const graphQLAdminPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
hasura: {
create: {
postgresqlDatabase,
postgresqlPassword,
postgresqlUser,
graphQLAdminPassword
}
}
}
});
} else if (type === 'fider') {
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'fider';
const jwtSecret = encrypt(generatePassword({ length: 64, symbols: true }));
await prisma.service.update({
where: { id },
data: {
type,
fider: {
create: {
postgresqlDatabase,
postgresqlPassword,
postgresqlUser,
jwtSecret
}
}
}
});
} else if (type === 'moodle') {
const defaultUsername = cuid();
const defaultPassword = encrypt(generatePassword({}));
const defaultEmail = `${cuid()}@example.com`;
const mariadbUser = cuid();
const mariadbPassword = encrypt(generatePassword({}));
const mariadbDatabase = 'moodle_db';
const mariadbRootUser = cuid();
const mariadbRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
moodle: {
create: {
defaultUsername,
defaultPassword,
defaultEmail,
mariadbUser,
mariadbPassword,
mariadbDatabase,
mariadbRootUser,
mariadbRootUserPassword
}
}
}
});
} else if (type === 'appwrite') {
const opensslKeyV1 = encrypt(generatePassword({}));
const executorSecret = encrypt(generatePassword({}));
const redisPassword = encrypt(generatePassword({}));
const mariadbHost = `${id}-mariadb`
const mariadbUser = cuid();
const mariadbPassword = encrypt(generatePassword({}));
const mariadbDatabase = 'appwrite';
const mariadbRootUser = cuid();
const mariadbRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
appwrite: {
create: {
opensslKeyV1,
executorSecret,
redisPassword,
mariadbHost,
mariadbUser,
mariadbPassword,
mariadbDatabase,
mariadbRootUser,
mariadbRootUserPassword
}
}
}
});
} else if (type === 'glitchTip') {
const defaultUsername = cuid();
const defaultEmail = `${defaultUsername}@example.com`;
const defaultPassword = encrypt(generatePassword({}));
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'glitchTip';
const secretKeyBase = encrypt(generatePassword({ length: 64 }));
await prisma.service.update({
where: { id },
data: {
type,
glitchTip: {
create: {
postgresqlDatabase,
postgresqlUser,
postgresqlPassword,
secretKeyBase,
defaultEmail,
defaultUsername,
defaultPassword,
}
}
}
});
} else if (type === 'searxng') {
const secretKey = encrypt(generatePassword({ length: 32, isHex: true }))
const redisPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
searxng: {
create: {
secretKey,
redisPassword,
}
}
}
});
} else if (type === 'weblate') {
const adminPassword = encrypt(generatePassword({}))
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'weblate';
await prisma.service.update({
where: { id },
data: {
type,
weblate: {
create: {
adminPassword,
postgresqlHost: `${id}-postgresql`,
postgresqlPort: 5432,
postgresqlUser,
postgresqlPassword,
postgresqlDatabase,
}
}
}
});
} else if (type === 'taiga') {
const secretKey = encrypt(generatePassword({}))
const erlangSecret = encrypt(generatePassword({}))
const rabbitMQUser = cuid();
const djangoAdminUser = cuid();
const djangoAdminPassword = encrypt(generatePassword({}))
const rabbitMQPassword = encrypt(generatePassword({}))
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'taiga';
await prisma.service.update({
where: { id },
data: {
type,
taiga: {
create: {
secretKey,
erlangSecret,
djangoAdminUser,
djangoAdminPassword,
rabbitMQUser,
rabbitMQPassword,
postgresqlHost: `${id}-postgresql`,
postgresqlPort: 5432,
postgresqlUser,
postgresqlPassword,
postgresqlDatabase,
}
}
}
});
} else {
await prisma.service.update({
where: { id },
data: {
type
}
});
}
}
export async function removeService({ id }: { id: string }): Promise<void> {
await prisma.serviceSecret.deleteMany({ where: { serviceId: id } });
await prisma.servicePersistentStorage.deleteMany({ where: { serviceId: id } });
await prisma.meiliSearch.deleteMany({ where: { serviceId: id } });
await prisma.fider.deleteMany({ where: { serviceId: id } });
await prisma.ghost.deleteMany({ where: { serviceId: id } });
await prisma.umami.deleteMany({ where: { serviceId: id } });
await prisma.hasura.deleteMany({ where: { serviceId: id } });
await prisma.plausibleAnalytics.deleteMany({ where: { serviceId: id } });
await prisma.minio.deleteMany({ where: { serviceId: id } });
await prisma.vscodeserver.deleteMany({ where: { serviceId: id } });
await prisma.wordpress.deleteMany({ where: { serviceId: id } });
await prisma.glitchTip.deleteMany({ where: { serviceId: id } });
await prisma.moodle.deleteMany({ where: { serviceId: id } });
await prisma.appwrite.deleteMany({ where: { serviceId: id } });
await prisma.searxng.deleteMany({ where: { serviceId: id } });
await prisma.weblate.deleteMany({ where: { serviceId: id } });
await prisma.taiga.deleteMany({ where: { serviceId: id } });
await prisma.service.delete({ where: { id } });
}
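configureServiceType only seeds credentials for the selected type, and removeService deletes every type-specific table before the service row itself, which is why each new service type here (searxng, weblate, taiga) needs both a create branch and a deleteMany call. A minimal usage sketch, with the surrounding route wiring assumed:

// Hypothetical handler sketch: ids and route wiring are assumptions; only
// configureServiceType/removeService come from the file above.
import { configureServiceType, removeService } from '../../../lib/services/common';

export async function setServiceType(serviceId: string, type: string): Promise<void> {
  // Generates and encrypts the type-specific credentials (passwords, keys, db users).
  await configureServiceType({ id: serviceId, type });
}

export async function destroyService(serviceId: string): Promise<void> {
  // Removes secrets, persistent storage and every per-type table, then the service row.
  await removeService({ id: serviceId });
}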

File diff suppressed because it is too large

View File

@@ -599,6 +599,54 @@ export const glitchTip = [{
isBoolean: false,
isEncrypted: true
},
{
name: 'emailSmtpHost',
isEditable: true,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'emailSmtpPassword',
isEditable: true,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'emailSmtpUseSsl',
isEditable: true,
isLowerCase: false,
isNumber: false,
isBoolean: true,
isEncrypted: false
},
{
name: 'emailSmtpPort',
isEditable: true,
isLowerCase: false,
isNumber: true,
isBoolean: false,
isEncrypted: false
},
{
name: 'emailSmtpUser',
isEditable: true,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'defaultEmail',
isEditable: false,
@@ -624,7 +672,7 @@ export const glitchTip = [{
isEncrypted: true
},
{
name: 'defaultFromEmail',
name: 'defaultEmailFrom',
isEditable: true,
isLowerCase: false,
isNumber: false,
@@ -671,3 +719,149 @@ export const glitchTip = [{
isBoolean: true,
isEncrypted: false
}]
export const searxng = [{
name: 'secretKey',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'redisPassword',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
}]
export const weblate = [{
name: 'adminPassword',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'postgresqlHost',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'postgresqlPort',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'postgresqlUser',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'postgresqlPassword',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'postgresqlDatabase',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
}]
export const taiga = [{
name: 'secretKey',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'djangoAdminUser',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'djangoAdminPassword',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'rabbitMQUser',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'rabbitMQPassword',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'postgresqlHost',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'postgresqlPort',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'postgresqlUser',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
},
{
name: 'postgresqlPassword',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: true
},
{
name: 'postgresqlDatabase',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
}]
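These field descriptors drive how the stored values are handled: isEditable gates form inputs, isEncrypted marks values that must be decrypted before use, and isNumber/isBoolean control casting. A small consumer sketch, assuming a decrypt helper like the one used elsewhere in this diff (the ServiceField type and toEnvValue helper are illustrative names):

// Sketch: turn a descriptor plus a stored raw value into a usable env-style value.
import { decrypt } from '../common'; // path assumed

type ServiceField = {
  name: string;
  isEditable: boolean;
  isLowerCase: boolean;
  isNumber: boolean;
  isBoolean: boolean;
  isEncrypted: boolean;
};

function toEnvValue(field: ServiceField, raw: string): string | number | boolean {
  let value: string | number | boolean = field.isEncrypted ? decrypt(raw) : raw;
  if (field.isLowerCase && typeof value === 'string') value = value.toLowerCase();
  if (field.isNumber) value = Number(value);
  if (field.isBoolean) value = value === 'true' || value === true;
  return value;
}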

View File

@@ -0,0 +1,215 @@
export const supportedServiceTypesAndVersions = [
{
name: 'plausibleanalytics',
fancyName: 'Plausible Analytics',
baseImage: 'plausible/analytics',
images: ['bitnami/postgresql:13.2.0', 'yandex/clickhouse-server:21.3.2.5'],
versions: ['latest', 'stable'],
recommendedVersion: 'stable',
ports: {
main: 8000
}
},
{
name: 'nocodb',
fancyName: 'NocoDB',
baseImage: 'nocodb/nocodb',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'minio',
fancyName: 'MinIO',
baseImage: 'minio/minio',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 9001
}
},
{
name: 'vscodeserver',
fancyName: 'VSCode Server',
baseImage: 'codercom/code-server',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'wordpress',
fancyName: 'Wordpress',
baseImage: 'wordpress',
images: ['bitnami/mysql:5.7'],
versions: ['latest', 'php8.1', 'php8.0', 'php7.4', 'php7.3'],
recommendedVersion: 'latest',
ports: {
main: 80
}
},
{
name: 'vaultwarden',
fancyName: 'Vaultwarden',
baseImage: 'vaultwarden/server',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 80
}
},
{
name: 'languagetool',
fancyName: 'LanguageTool',
baseImage: 'silviof/docker-languagetool',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8010
}
},
{
name: 'n8n',
fancyName: 'n8n',
baseImage: 'n8nio/n8n',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 5678
}
},
{
name: 'uptimekuma',
fancyName: 'Uptime Kuma',
baseImage: 'louislam/uptime-kuma',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 3001
}
},
{
name: 'ghost',
fancyName: 'Ghost',
baseImage: 'bitnami/ghost',
images: ['bitnami/mariadb'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 2368
}
},
{
name: 'meilisearch',
fancyName: 'Meilisearch',
baseImage: 'getmeili/meilisearch',
images: [],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 7700
}
},
{
name: 'umami',
fancyName: 'Umami',
baseImage: 'ghcr.io/mikecao/umami',
images: ['postgres:12-alpine'],
versions: ['postgresql-latest'],
recommendedVersion: 'postgresql-latest',
ports: {
main: 3000
}
},
{
name: 'hasura',
fancyName: 'Hasura',
baseImage: 'hasura/graphql-engine',
images: ['postgres:12-alpine'],
versions: ['latest', 'v2.10.0', 'v2.5.1'],
recommendedVersion: 'v2.10.0',
ports: {
main: 8080
}
},
{
name: 'fider',
fancyName: 'Fider',
baseImage: 'getfider/fider',
images: ['postgres:12-alpine'],
versions: ['stable'],
recommendedVersion: 'stable',
ports: {
main: 3000
}
},
{
name: 'appwrite',
fancyName: 'Appwrite',
baseImage: 'appwrite/appwrite',
images: ['mariadb:10.7', 'redis:6.2-alpine', 'appwrite/telegraf:1.4.0'],
versions: ['latest', '0.15.3'],
recommendedVersion: '0.15.3',
ports: {
main: 80
}
},
// {
// name: 'moodle',
// fancyName: 'Moodle',
// baseImage: 'bitnami/moodle',
// images: [],
// versions: ['latest', 'v4.0.2'],
// recommendedVersion: 'latest',
// ports: {
// main: 8080
// }
// }
{
name: 'glitchTip',
fancyName: 'GlitchTip',
baseImage: 'glitchtip/glitchtip',
images: ['postgres:14-alpine', 'redis:7-alpine'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8000
}
},
{
name: 'searxng',
fancyName: 'SearXNG',
baseImage: 'searxng/searxng',
images: [],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'weblate',
fancyName: 'Weblate',
baseImage: 'weblate/weblate',
images: ['postgres:14-alpine', 'redis:6-alpine'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
// {
// name: 'taiga',
// fancyName: 'Taiga',
// baseImage: 'taigaio/taiga-front',
// images: ['postgres:12.3', 'rabbitmq:3.8-management-alpine', 'taigaio/taiga-back', 'taigaio/taiga-events', 'taigaio/taiga-protected'],
// versions: ['latest'],
// recommendedVersion: 'latest',
// ports: {
// main: 80
// }
// },
];
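The list above is the source of truth for which images, versions and main ports each service template exposes. A lookup sketch (import path and helper name are assumptions):

// Sketch: resolve the default image, version and compose port for a chosen service type.
import { supportedServiceTypesAndVersions } from '../lib/services/supportedVersions'; // path assumed

export function getServiceDefaults(type: string) {
  const template = supportedServiceTypesAndVersions.find((s) => s.name === type);
  if (!template) throw new Error(`Unsupported service type: ${type}`);
  return {
    image: template.baseImage,
    version: template.recommendedVersion,
    mainPort: template.ports.main
  };
}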

View File

@@ -21,7 +21,6 @@ export default fp<FastifyJWTOptions>(async (fastify, opts) => {
try {
await request.jwtVerify()
} catch (err) {
console.log(err)
reply.send(err)
}
})

View File

@@ -3,16 +3,23 @@ import crypto from 'node:crypto'
import jsonwebtoken from 'jsonwebtoken';
import axios from 'axios';
import { FastifyReply } from 'fastify';
import fs from 'fs/promises';
import yaml from 'js-yaml';
import { day } from '../../../../lib/dayjs';
import { setDefaultBaseImage, setDefaultConfiguration } from '../../../../lib/buildPacks/common';
import { checkDomainsIsValidInDNS, checkDoubleBranch, checkExposedPort, decrypt, encrypt, errorHandler, executeDockerCmd, generateSshKeyPair, getContainerUsage, getDomain, getFreeExposedPort, isDev, isDomainConfigured, listSettings, prisma, stopBuild, uniqueName } from '../../../../lib/common';
import { makeLabelForStandaloneApplication, setDefaultBaseImage, setDefaultConfiguration } from '../../../../lib/buildPacks/common';
import { checkDomainsIsValidInDNS, checkDoubleBranch, checkExposedPort, createDirectories, decrypt, defaultComposeConfiguration, encrypt, errorHandler, executeDockerCmd, generateSshKeyPair, getContainerUsage, getDomain, isDev, isDomainConfigured, listSettings, prisma, stopBuild, uniqueName } from '../../../../lib/common';
import { checkContainer, formatLabelsOnDocker, isContainerExited, removeContainer } from '../../../../lib/docker';
import { scheduler } from '../../../../lib/scheduler';
import type { FastifyRequest } from 'fastify';
import type { GetImages, CancelDeployment, CheckDNS, CheckRepository, DeleteApplication, DeleteSecret, DeleteStorage, GetApplicationLogs, GetBuildIdLogs, GetBuildLogs, SaveApplication, SaveApplicationSettings, SaveApplicationSource, SaveDeployKey, SaveDestination, SaveSecret, SaveStorage, DeployApplication, CheckDomain, StopPreviewApplication } from './types';
import { OnlyId } from '../../../../types';
function filterObject(obj, callback) {
return Object.fromEntries(Object.entries(obj).
filter(([key, val]) => callback(val, key)));
}
export async function listApplications(request: FastifyRequest) {
try {
const { teamId } = request.user
@@ -34,7 +41,7 @@ export async function getImages(request: FastifyRequest<GetImages>) {
const { buildPack, deploymentType } = request.body
let publishDirectory = undefined;
let port = undefined
const { baseImage, baseBuildImage, baseBuildImages, baseImages, } = setDefaultBaseImage(
const { baseImage, baseBuildImage, baseBuildImages, baseImages } = setDefaultBaseImage(
buildPack, deploymentType
);
if (buildPack === 'nextjs') {
@@ -56,8 +63,7 @@ export async function getImages(request: FastifyRequest<GetImages>) {
}
}
return { baseBuildImage, baseBuildImages, publishDirectory, port }
return { baseImage, baseImages, baseBuildImage, baseBuildImages, publishDirectory, port }
} catch ({ status, message }) {
return errorHandler({ status, message })
}
@@ -75,7 +81,6 @@ export async function getApplicationStatus(request: FastifyRequest<OnlyId>) {
isExited = await isContainerExited(application.destinationDocker.id, id);
}
return {
isQueueActive: scheduler.workers.has('deployApplication'),
isRunning,
isExited,
};
@@ -151,7 +156,8 @@ export async function getApplicationFromDB(id: string, teamId: string) {
settings: true,
gitSource: { include: { githubApp: true, gitlabApp: true } },
secrets: true,
persistentStorage: true
persistentStorage: true,
connectedDatabase: true
}
});
if (!application) {
@@ -178,32 +184,39 @@ export async function getApplicationFromDB(id: string, teamId: string) {
}
export async function getApplicationFromDBWebhook(projectId: number, branch: string) {
try {
let application = await prisma.application.findFirst({
let applications = await prisma.application.findMany({
where: { projectId, branch, settings: { autodeploy: true } },
include: {
destinationDocker: true,
settings: true,
gitSource: { include: { githubApp: true, gitlabApp: true } },
secrets: true,
persistentStorage: true
persistentStorage: true,
connectedDatabase: true
}
});
if (!application) {
if (applications.length === 0) {
throw { status: 500, message: 'Application not configured.' }
}
application = decryptApplication(application);
const { baseImage, baseBuildImage, baseBuildImages, baseImages } = setDefaultBaseImage(
application.buildPack
);
applications = applications.map((application: any) => {
application = decryptApplication(application);
const { baseImage, baseBuildImage, baseBuildImages, baseImages } = setDefaultBaseImage(
application.buildPack
);
// Set default build images
if (!application.baseImage) {
application.baseImage = baseImage;
}
if (!application.baseBuildImage) {
application.baseBuildImage = baseBuildImage;
}
return { ...application, baseBuildImages, baseImages };
// Set default build images
if (!application.baseImage) {
application.baseImage = baseImage;
}
if (!application.baseBuildImage) {
application.baseBuildImage = baseBuildImage;
}
application.baseBuildImages = baseBuildImages;
application.baseImages = baseImages;
return application
})
return applications;
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -231,16 +244,16 @@ export async function saveApplication(request: FastifyRequest<SaveApplication>,
denoOptions,
baseImage,
baseBuildImage,
deploymentType
deploymentType,
baseDatabaseBranch
} = request.body
if (port) port = Number(port);
if (exposePort) {
exposePort = Number(exposePort);
}
const { destinationDockerId } = await prisma.application.findUnique({ where: { id } })
if (exposePort) await checkExposedPort({ id, exposePort, dockerId: destinationDockerId })
const { destinationDocker: { engine, remoteEngine, remoteIpAddress }, exposePort: configuredPort } = await prisma.application.findUnique({ where: { id }, include: { destinationDocker: true } })
if (exposePort) await checkExposedPort({ id, configuredPort, exposePort, engine, remoteEngine, remoteIpAddress })
if (denoOptions) denoOptions = denoOptions.trim();
const defaultConfiguration = await setDefaultConfiguration({
buildPack,
@@ -253,22 +266,43 @@ export async function saveApplication(request: FastifyRequest<SaveApplication>,
dockerFileLocation,
denoMainFile
});
await prisma.application.update({
where: { id },
data: {
name,
fqdn,
exposePort,
pythonWSGI,
pythonModule,
pythonVariable,
denoOptions,
baseImage,
baseBuildImage,
deploymentType,
...defaultConfiguration
}
});
if (baseDatabaseBranch) {
await prisma.application.update({
where: { id },
data: {
name,
fqdn,
exposePort,
pythonWSGI,
pythonModule,
pythonVariable,
denoOptions,
baseImage,
baseBuildImage,
deploymentType,
...defaultConfiguration,
connectedDatabase: { update: { hostedDatabaseDBName: baseDatabaseBranch } }
}
});
} else {
await prisma.application.update({
where: { id },
data: {
name,
fqdn,
exposePort,
pythonWSGI,
pythonModule,
pythonVariable,
denoOptions,
baseImage,
baseBuildImage,
deploymentType,
...defaultConfiguration
}
});
}
return reply.code(201).send();
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -279,15 +313,15 @@ export async function saveApplication(request: FastifyRequest<SaveApplication>,
export async function saveApplicationSettings(request: FastifyRequest<SaveApplicationSettings>, reply: FastifyReply) {
try {
const { id } = request.params
const { debug, previews, dualCerts, autodeploy, branch, projectId, isBot } = request.body
const isDouble = await checkDoubleBranch(branch, projectId);
if (isDouble && autodeploy) {
await prisma.applicationSettings.updateMany({ where: { application: { branch, projectId } }, data: { autodeploy: false } })
throw { status: 500, message: 'Cannot activate automatic deployments until only one application is defined for this repository / branch.' }
}
const { debug, previews, dualCerts, autodeploy, branch, projectId, isBot, isDBBranching } = request.body
// const isDouble = await checkDoubleBranch(branch, projectId);
// if (isDouble && autodeploy) {
// await prisma.applicationSettings.updateMany({ where: { application: { branch, projectId } }, data: { autodeploy: false } })
// throw { status: 500, message: 'Cannot activate automatic deployments until only one application is defined for this repository / branch.' }
// }
await prisma.application.update({
where: { id },
data: { fqdn: isBot ? null : undefined, settings: { update: { debug, previews, dualCerts, autodeploy, isBot } } },
data: { fqdn: isBot ? null : undefined, settings: { update: { debug, previews, dualCerts, autodeploy, isBot, isDBBranching } } },
include: { destinationDocker: true }
});
return reply.code(201).send();
@@ -315,6 +349,113 @@ export async function stopPreviewApplication(request: FastifyRequest<StopPreview
return errorHandler({ status, message })
}
}
export async function restartApplication(request: FastifyRequest<OnlyId>, reply: FastifyReply) {
try {
const { id } = request.params
const { teamId } = request.user
let application: any = await getApplicationFromDB(id, teamId);
if (application?.destinationDockerId) {
const buildId = cuid();
const { id: dockerId, network } = application.destinationDocker;
const { secrets, pullmergeRequestId, port, repository, persistentStorage, id: applicationId, buildPack, exposePort } = application;
const envs = [
`PORT=${port}`
];
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
if (!secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
}
});
}
const { workdir } = await createDirectories({ repository, buildId });
const labels = []
let image = null
const { stdout: container } = await executeDockerCmd({ dockerId, command: `docker container ls --filter 'label=com.docker.compose.service=${id}' --format '{{json .}}'` })
const containersArray = container.trim().split('\n');
for (const container of containersArray) {
const containerObj = formatLabelsOnDocker(container);
image = containerObj[0].Image
Object.keys(containerObj[0].Labels).forEach(function (key) {
if (key.startsWith('coolify')) {
labels.push(`${key}=${containerObj[0].Labels[key]}`)
}
})
}
let imageFound = false;
try {
await executeDockerCmd({
dockerId,
command: `docker image inspect ${image}`
})
imageFound = true;
} catch (error) {
//
}
if (!imageFound) {
throw { status: 500, message: 'Image not found, cannot restart application.' }
}
await fs.writeFile(`${workdir}/.env`, envs.join('\n'));
let envFound = false;
try {
envFound = !!(await fs.stat(`${workdir}/.env`));
} catch (error) {
//
}
const volumes =
persistentStorage?.map((storage) => {
return `${applicationId}${storage.path.replace(/\//gi, '-')}:${buildPack !== 'docker' ? '/app' : ''
}${storage.path}`;
}) || [];
const composeVolumes = volumes.map((volume) => {
return {
[`${volume.split(':')[0]}`]: {
name: volume.split(':')[0]
}
};
});
const composeFile = {
version: '3.8',
services: {
[applicationId]: {
image,
container_name: applicationId,
volumes,
env_file: envFound ? [`${workdir}/.env`] : [],
labels,
depends_on: [],
expose: [port],
...(exposePort ? { ports: [`${exposePort}:${port}`] } : {}),
...defaultComposeConfiguration(network),
}
},
networks: {
[network]: {
external: true
}
},
volumes: Object.assign({}, ...composeVolumes)
};
await fs.writeFile(`${workdir}/docker-compose.yml`, yaml.dump(composeFile));
await executeDockerCmd({ dockerId, command: `docker stop -t 0 ${id}` })
await executeDockerCmd({ dockerId, command: `docker rm ${id}` })
await executeDockerCmd({ dockerId, command: `docker compose --project-directory ${workdir} up -d` })
return reply.code(201).send();
}
throw { status: 500, message: 'Application cannot be restarted.' }
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function stopApplication(request: FastifyRequest<OnlyId>, reply: FastifyReply) {
try {
const { id } = request.params
@@ -335,12 +476,14 @@ export async function stopApplication(request: FastifyRequest<OnlyId>, reply: Fa
export async function deleteApplication(request: FastifyRequest<DeleteApplication>, reply: FastifyReply) {
try {
const { id } = request.params
const { force } = request.body
const { teamId } = request.user
const application = await prisma.application.findUnique({
where: { id },
include: { destinationDocker: true }
});
if (application?.destinationDockerId && application.destinationDocker?.network) {
if (!force && application?.destinationDockerId && application.destinationDocker?.network) {
const { stdout: containers } = await executeDockerCmd({
dockerId: application.destinationDocker.id,
command: `docker ps -a --filter network=${application.destinationDocker.network} --filter name=${id} --format '{{json .}}'`
@@ -359,6 +502,7 @@ export async function deleteApplication(request: FastifyRequest<DeleteApplicatio
await prisma.build.deleteMany({ where: { applicationId: id } });
await prisma.secret.deleteMany({ where: { applicationId: id } });
await prisma.applicationPersistentStorage.deleteMany({ where: { applicationId: id } });
await prisma.applicationConnectedDatabase.deleteMany({ where: { applicationId: id } });
if (teamId === '0') {
await prisma.application.deleteMany({ where: { id } });
} else {
@@ -382,20 +526,22 @@ export async function checkDomain(request: FastifyRequest<CheckDomain>) {
export async function checkDNS(request: FastifyRequest<CheckDNS>) {
try {
const { id } = request.params
let { exposePort, fqdn, forceSave, dualCerts } = request.body
if (fqdn) fqdn = fqdn.toLowerCase();
if (!fqdn) {
return {}
} else {
fqdn = fqdn.toLowerCase();
}
if (exposePort) exposePort = Number(exposePort);
const { destinationDocker: { id: dockerId, remoteIpAddress, remoteEngine }, exposePort: configuredPort } = await prisma.application.findUnique({ where: { id }, include: { destinationDocker: true } })
const { destinationDocker: { engine, remoteIpAddress, remoteEngine }, exposePort: configuredPort } = await prisma.application.findUnique({ where: { id }, include: { destinationDocker: true } })
const { isDNSCheckEnabled } = await prisma.setting.findFirst({});
const found = await isDomainConfigured({ id, fqdn, remoteIpAddress });
if (found) {
throw { status: 500, message: `Domain ${getDomain(fqdn).replace('www.', '')} is already in use!` }
}
if (exposePort) await checkExposedPort({ id, configuredPort, exposePort, dockerId, remoteIpAddress })
if (exposePort) await checkExposedPort({ id, configuredPort, exposePort, engine, remoteEngine, remoteIpAddress })
if (isDNSCheckEnabled && !isDev && !forceSave) {
let hostname = request.hostname.split(':')[0];
if (remoteEngine) hostname = remoteIpAddress;
@@ -452,7 +598,10 @@ export async function deployApplication(request: FastifyRequest<DeployApplicatio
data: {
id: buildId,
applicationId: id,
sourceBranch: branch,
branch: application.branch,
pullmergeRequestId: pullmergeRequestId?.toString(),
forceRebuild,
destinationDockerId: application.destinationDocker?.id,
gitSourceId: application.gitSource?.id,
githubAppId: application.gitSource?.githubApp?.id,
@@ -461,24 +610,6 @@ export async function deployApplication(request: FastifyRequest<DeployApplicatio
type: 'manual'
}
});
if (pullmergeRequestId) {
scheduler.workers.get('deployApplication').postMessage({
build_id: buildId,
type: 'manual',
...application,
sourceBranch: branch,
pullmergeRequestId,
forceRebuild
});
} else {
scheduler.workers.get('deployApplication').postMessage({
build_id: buildId,
type: 'manual',
...application,
forceRebuild
});
}
return {
buildId
};
@@ -576,12 +707,12 @@ export async function saveRepository(request, reply) {
data: { repository, branch, projectId, settings: { update: { autodeploy, isPublicRepository } } }
});
}
if (!isPublicRepository) {
const isDouble = await checkDoubleBranch(branch, projectId);
if (isDouble) {
await prisma.applicationSettings.updateMany({ where: { application: { branch, projectId } }, data: { autodeploy: false, isPublicRepository } })
}
}
// if (!isPublicRepository) {
// const isDouble = await checkDoubleBranch(branch, projectId);
// if (isDouble) {
// await prisma.applicationSettings.updateMany({ where: { application: { branch, projectId } }, data: { autodeploy: false, isPublicRepository } })
// }
// }
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -630,6 +761,16 @@ export async function saveBuildPack(request, reply) {
return errorHandler({ status, message })
}
}
export async function saveConnectedDatabase(request, reply) {
try {
const { id } = request.params
const { databaseId, type } = request.body
await prisma.application.update({ where: { id }, data: { connectedDatabase: { upsert: { create: { database: { connect: { id: databaseId } }, hostedDatabaseType: type }, update: { database: { connect: { id: databaseId } }, hostedDatabaseType: type } } } } })
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function getSecrets(request: FastifyRequest<OnlyId>) {
try {
@@ -786,7 +927,6 @@ export async function getPreviews(request: FastifyRequest<OnlyId>) {
})
}
} catch ({ status, message }) {
console.log({ status, message })
return errorHandler({ status, message })
}
}
@@ -879,8 +1019,13 @@ export async function getBuildIdLogs(request: FastifyRequest<GetBuildIdLogs>) {
orderBy: { time: 'asc' }
});
const data = await prisma.build.findFirst({ where: { id: buildId } });
const createdAt = day(data.createdAt).utc();
return {
logs,
logs: logs.map(log => {
log.time = Number(log.time)
return log
}),
took: day().diff(createdAt) / 1000,
status: data?.status || 'queued'
}
} catch ({ status, message }) {
@@ -955,4 +1100,59 @@ export async function cancelDeployment(request: FastifyRequest<CancelDeployment>
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function createdBranchDatabase(database: any, baseDatabaseBranch: string, pullmergeRequestId: string) {
try {
if (!baseDatabaseBranch) return
const { id, type, destinationDockerId, rootUser, rootUserPassword, dbUser } = database;
if (destinationDockerId) {
if (type === 'postgresql') {
const decryptedRootUserPassword = decrypt(rootUserPassword);
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker exec ${id} pg_dump -d "postgresql://postgres:${decryptedRootUserPassword}@${id}:5432/${baseDatabaseBranch}" --encoding=UTF8 --schema-only -f /tmp/${baseDatabaseBranch}.dump`
})
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker exec ${id} psql postgresql://postgres:${decryptedRootUserPassword}@${id}:5432 -c "CREATE DATABASE branch_${pullmergeRequestId}"`
})
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker exec ${id} psql -d "postgresql://postgres:${decryptedRootUserPassword}@${id}:5432/branch_${pullmergeRequestId}" -f /tmp/${baseDatabaseBranch}.dump`
})
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker exec ${id} psql postgresql://postgres:${decryptedRootUserPassword}@${id}:5432 -c "ALTER DATABASE branch_${pullmergeRequestId} OWNER TO ${dbUser}"`
})
}
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function removeBranchDatabase(database: any, pullmergeRequestId: string) {
try {
const { id, type, destinationDockerId, rootUser, rootUserPassword } = database;
if (destinationDockerId) {
if (type === 'postgresql') {
const decryptedRootUserPassword = decrypt(rootUserPassword);
// Terminate all connections to the database
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker exec ${id} psql postgresql://postgres:${decryptedRootUserPassword}@${id}:5432 -c "SELECT pg_terminate_backend(pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = 'branch_${pullmergeRequestId}' AND pid <> pg_backend_pid();"`
})
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker exec ${id} psql postgresql://postgres:${decryptedRootUserPassword}@${id}:5432 -c "DROP DATABASE branch_${pullmergeRequestId}"`
})
}
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
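createdBranchDatabase and removeBranchDatabase back the new database-branching option for PR previews: when a pull-request deployment starts, a branch_<PR id> copy of the connected PostgreSQL database is created from a schema-only dump of the base branch, and it is dropped again when the preview goes away. A hedged sketch of how a webhook handler might wire them up (the event shape and connectedDatabase nesting are assumptions):

// Sketch only: the pull-request webhook plumbing is assumed; the two helpers,
// isDBBranching and the connectedDatabase relation come from the handlers above.
import { createdBranchDatabase, removeBranchDatabase } from './handlers';

export async function onPullRequestEvent(application: any, action: 'opened' | 'closed', pullmergeRequestId: string) {
  const database = application.connectedDatabase?.database;
  if (!database || !application.settings?.isDBBranching) return;
  if (action === 'opened') {
    await createdBranchDatabase(database, application.connectedDatabase.hostedDatabaseDBName, pullmergeRequestId);
  } else {
    await removeBranchDatabase(database, pullmergeRequestId);
  }
}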

View File

@@ -1,6 +1,6 @@
import { FastifyPluginAsync } from 'fastify';
import { OnlyId } from '../../../../types';
import { cancelDeployment, checkDNS, checkDomain, checkRepository, deleteApplication, deleteSecret, deleteStorage, deployApplication, getApplication, getApplicationLogs, getApplicationStatus, getBuildIdLogs, getBuildLogs, getBuildPack, getGitHubToken, getGitLabSSHKey, getImages, getPreviews, getSecrets, getStorages, getUsage, listApplications, newApplication, saveApplication, saveApplicationSettings, saveApplicationSource, saveBuildPack, saveDeployKey, saveDestination, saveGitLabSSHKey, saveRepository, saveSecret, saveStorage, stopApplication, stopPreviewApplication } from './handlers';
import { cancelDeployment, checkDNS, checkDomain, checkRepository, deleteApplication, deleteSecret, deleteStorage, deployApplication, getApplication, getApplicationLogs, getApplicationStatus, getBuildIdLogs, getBuildLogs, getBuildPack, getGitHubToken, getGitLabSSHKey, getImages, getPreviews, getSecrets, getStorages, getUsage, listApplications, newApplication, restartApplication, saveApplication, saveApplicationSettings, saveApplicationSource, saveBuildPack, saveConnectedDatabase, saveDeployKey, saveDestination, saveGitLabSSHKey, saveRepository, saveSecret, saveStorage, stopApplication, stopPreviewApplication } from './handlers';
import type { CancelDeployment, CheckDNS, CheckDomain, CheckRepository, DeleteApplication, DeleteSecret, DeleteStorage, DeployApplication, GetApplicationLogs, GetBuildIdLogs, GetBuildLogs, GetImages, SaveApplication, SaveApplicationSettings, SaveApplicationSource, SaveDeployKey, SaveDestination, SaveSecret, SaveStorage, StopPreviewApplication } from './types';
@@ -19,6 +19,7 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get<OnlyId>('/:id/status', async (request) => await getApplicationStatus(request));
fastify.post<OnlyId>('/:id/restart', async (request, reply) => await restartApplication(request, reply));
fastify.post<OnlyId>('/:id/stop', async (request, reply) => await stopApplication(request, reply));
fastify.post<StopPreviewApplication>('/:id/stop/preview', async (request, reply) => await stopPreviewApplication(request, reply));
@@ -54,6 +55,8 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/:id/configuration/buildpack', async (request) => await getBuildPack(request));
fastify.post('/:id/configuration/buildpack', async (request, reply) => await saveBuildPack(request, reply));
fastify.post('/:id/configuration/database', async (request, reply) => await saveConnectedDatabase(request, reply));
fastify.get<OnlyId>('/:id/configuration/sshkey', async (request) => await getGitLabSSHKey(request));
fastify.post<OnlyId>('/:id/configuration/sshkey', async (request, reply) => await saveGitLabSSHKey(request, reply));

View File

@@ -20,15 +20,17 @@ export interface SaveApplication extends OnlyId {
denoOptions: string,
baseImage: string,
baseBuildImage: string,
deploymentType: string
deploymentType: string,
baseDatabaseBranch: string
}
}
export interface SaveApplicationSettings extends OnlyId {
Querystring: { domain: string; };
Body: { debug: boolean; previews: boolean; dualCerts: boolean; autodeploy: boolean; branch: string; projectId: number; isBot: boolean; };
Body: { debug: boolean; previews: boolean; dualCerts: boolean; autodeploy: boolean; branch: string; projectId: number; isBot: boolean; isDBBranching: boolean };
}
export interface DeleteApplication extends OnlyId {
Querystring: { domain: string; };
Body: { force: boolean }
}
export interface CheckDomain extends OnlyId {
Querystring: { domain: string; };

View File

@@ -11,6 +11,7 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
version,
whiteLabeled: process.env.COOLIFY_WHITE_LABELED === 'true',
whiteLabeledIcon: process.env.COOLIFY_WHITE_LABELED_ICON,
isRegistrationEnabled: settings.isRegistrationEnabled,
}
} catch ({ status, message }) {
return errorHandler({ status, message })

View File

@@ -3,11 +3,11 @@ import type { FastifyRequest } from 'fastify';
import { FastifyReply } from 'fastify';
import yaml from 'js-yaml';
import fs from 'fs/promises';
import { ComposeFile, createDirectories, decrypt, encrypt, errorHandler, executeDockerCmd, generateDatabaseConfiguration, generatePassword, getContainerUsage, getDatabaseImage, getDatabaseVersions, getFreePublicPort, listSettings, makeLabelForStandaloneDatabase, prisma, startTraefikTCPProxy, stopDatabaseContainer, stopTcpHttpProxy, supportedDatabaseTypesAndVersions, uniqueName, updatePasswordInDb } from '../../../../lib/common';
import { ComposeFile, createDirectories, decrypt, defaultComposeConfiguration, encrypt, errorHandler, executeDockerCmd, generateDatabaseConfiguration, generatePassword, getContainerUsage, getDatabaseImage, getDatabaseVersions, getFreePublicPort, listSettings, makeLabelForStandaloneDatabase, prisma, startTraefikTCPProxy, stopDatabaseContainer, stopTcpHttpProxy, supportedDatabaseTypesAndVersions, uniqueName, updatePasswordInDb } from '../../../../lib/common';
import { day } from '../../../../lib/dayjs';
import { GetDatabaseLogs, OnlyId, SaveDatabase, SaveDatabaseDestination, SaveDatabaseSettings, SaveVersion } from '../../../../types';
import { SaveDatabaseType } from './types';
import type { OnlyId } from '../../../../types';
import type { DeleteDatabase, DeleteDatabaseSecret, GetDatabaseLogs, SaveDatabase, SaveDatabaseDestination, SaveDatabaseSecret, SaveDatabaseSettings, SaveDatabaseType, SaveVersion } from './types';
export async function listDatabases(request: FastifyRequest) {
try {
@@ -29,9 +29,9 @@ export async function newDatabase(request: FastifyRequest, reply: FastifyReply)
const name = uniqueName();
const dbUser = cuid();
const dbUserPassword = encrypt(generatePassword());
const dbUserPassword = encrypt(generatePassword({}));
const rootUser = cuid();
const rootUserPassword = encrypt(generatePassword());
const rootUserPassword = encrypt(generatePassword({}));
const defaultDatabase = cuid();
const { id } = await prisma.database.create({
@@ -61,16 +61,18 @@ export async function getDatabaseStatus(request: FastifyRequest<OnlyId>) {
where: { id, teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { destinationDocker: true, settings: true }
});
const { destinationDockerId, destinationDocker } = database;
if (destinationDockerId) {
try {
const { stdout } = await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker inspect --format '{{json .State}}' ${id}` })
if (database) {
const { destinationDockerId, destinationDocker } = database;
if (destinationDockerId) {
try {
const { stdout } = await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker inspect --format '{{json .State}}' ${id}` })
if (JSON.parse(stdout).Running) {
isRunning = true;
if (JSON.parse(stdout).Running) {
isRunning = true;
}
} catch (error) {
//
}
} catch (error) {
//
}
}
return {
@@ -92,15 +94,14 @@ export async function getDatabase(request: FastifyRequest<OnlyId>) {
if (!database) {
throw { status: 404, message: 'Database not found.' }
}
const { arch } = await listSettings();
const settings = await listSettings();
if (database.dbUserPassword) database.dbUserPassword = decrypt(database.dbUserPassword);
if (database.rootUserPassword) database.rootUserPassword = decrypt(database.rootUserPassword);
const configuration = generateDatabaseConfiguration(database, arch);
const settings = await listSettings();
const configuration = generateDatabaseConfiguration(database, settings.arch);
return {
privatePort: configuration?.privatePort,
database,
versions: await getDatabaseVersions(database.type, arch),
versions: await getDatabaseVersions(database.type, settings.arch),
settings
};
} catch ({ status, message }) {
@@ -167,6 +168,7 @@ export async function saveDatabaseDestination(request: FastifyRequest<SaveDataba
const { id } = request.params;
const { destinationId } = request.body;
const { arch } = await listSettings();
await prisma.database.update({
where: { id },
data: { destinationDocker: { connect: { id: destinationId } } }
@@ -181,7 +183,7 @@ export async function saveDatabaseDestination(request: FastifyRequest<SaveDataba
if (destinationDockerId) {
if (type && version) {
const baseImage = getDatabaseImage(type);
const baseImage = getDatabaseImage(type, arch);
executeDockerCmd({ dockerId, command: `docker pull ${baseImage}:${version}` })
}
}
@@ -219,7 +221,7 @@ export async function startDatabase(request: FastifyRequest<OnlyId>) {
const database = await prisma.database.findFirst({
where: { id, teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { destinationDocker: true, settings: true }
include: { destinationDocker: true, settings: true, databaseSecret: true }
});
const { arch } = await listSettings();
if (database.dbUserPassword) database.dbUserPassword = decrypt(database.dbUserPassword);
@@ -229,7 +231,8 @@ export async function startDatabase(request: FastifyRequest<OnlyId>) {
destinationDockerId,
destinationDocker,
publicPort,
settings: { isPublic }
settings: { isPublic },
databaseSecret
} = database;
const { privatePort, command, environmentVariables, image, volume, ulimits } =
generateDatabaseConfiguration(database, arch);
@@ -239,7 +242,11 @@ export async function startDatabase(request: FastifyRequest<OnlyId>) {
const labels = await makeLabelForStandaloneDatabase({ id, image, volume });
const { workdir } = await createDirectories({ repository: type, buildId: id });
if (databaseSecret.length > 0) {
databaseSecret.forEach((secret) => {
environmentVariables[secret.name] = decrypt(secret.value);
});
}
const composeFile: ComposeFile = {
version: '3.8',
services: {
@@ -247,20 +254,11 @@ export async function startDatabase(request: FastifyRequest<OnlyId>) {
container_name: id,
image,
command,
networks: [network],
environment: environmentVariables,
volumes: [volume],
ulimits,
labels,
restart: 'always',
deploy: {
restart_policy: {
condition: 'on-failure',
delay: '5s',
max_attempts: 3,
window: '120s'
}
}
...defaultComposeConfiguration(network),
}
},
networks: {
@@ -270,28 +268,16 @@ export async function startDatabase(request: FastifyRequest<OnlyId>) {
},
volumes: {
[volumeName]: {
external: true
name: volumeName,
}
}
};
const composeFileDestination = `${workdir}/docker-compose.yaml`;
await fs.writeFile(composeFileDestination, yaml.dump(composeFile));
try {
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker volume create ${volumeName}` })
} catch (error) {
console.log(error);
}
try {
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker compose -f ${composeFileDestination} up -d` })
if (isPublic) await startTraefikTCPProxy(destinationDocker, id, publicPort, privatePort);
return {};
} catch (error) {
console.log(error)
throw {
error
};
}
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker compose -f ${composeFileDestination} up -d` })
if (isPublic) await startTraefikTCPProxy(destinationDocker, id, publicPort, privatePort);
return {};
} catch ({ status, message }) {
return errorHandler({ status, message })
}
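defaultComposeConfiguration(network) itself is not shown in this diff; judging from the inline block it replaces here (the networks entry plus restart: 'always' and the on-failure restart policy), a plausible shape is:

// Assumed helper, reconstructed from the removed lines above; the real
// implementation in lib/common may differ.
export function defaultComposeConfiguration(network: string) {
  return {
    networks: [network],
    restart: 'always',
    deploy: {
      restart_policy: {
        condition: 'on-failure',
        delay: '5s',
        max_attempts: 3,
        window: '120s'
      }
    }
  };
}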
@@ -360,21 +346,25 @@ export async function getDatabaseLogs(request: FastifyRequest<GetDatabaseLogs>)
return errorHandler({ status, message })
}
}
export async function deleteDatabase(request: FastifyRequest<OnlyId>) {
export async function deleteDatabase(request: FastifyRequest<DeleteDatabase>) {
try {
const teamId = request.user.teamId;
const { id } = request.params;
const { force } = request.body;
const database = await prisma.database.findFirst({
where: { id, teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { destinationDocker: true, settings: true }
});
if (database.dbUserPassword) database.dbUserPassword = decrypt(database.dbUserPassword);
if (database.rootUserPassword) database.rootUserPassword = decrypt(database.rootUserPassword);
if (database.destinationDockerId) {
const everStarted = await stopDatabaseContainer(database);
if (everStarted) await stopTcpHttpProxy(id, database.destinationDocker, database.publicPort);
if (!force) {
if (database.dbUserPassword) database.dbUserPassword = decrypt(database.dbUserPassword);
if (database.rootUserPassword) database.rootUserPassword = decrypt(database.rootUserPassword);
if (database.destinationDockerId) {
const everStarted = await stopDatabaseContainer(database);
if (everStarted) await stopTcpHttpProxy(id, database.destinationDocker, database.publicPort);
}
}
await prisma.databaseSettings.deleteMany({ where: { databaseId: id } });
await prisma.databaseSecret.deleteMany({ where: { databaseId: id } });
await prisma.database.delete({ where: { id } });
return {}
} catch ({ status, message }) {
@@ -433,9 +423,13 @@ export async function saveDatabaseSettings(request: FastifyRequest<SaveDatabaseS
const { id } = request.params;
const { isPublic, appendOnly = true } = request.body;
const { destinationDocker: { id: dockerId } } = await prisma.database.findUnique({ where: { id }, include: { destinationDocker: true } })
const publicPort = await getFreePublicPort(id, dockerId);
let publicPort = null
const { destinationDocker: { remoteEngine, engine, remoteIpAddress } } = await prisma.database.findUnique({ where: { id }, include: { destinationDocker: true } })
if (isPublic) {
publicPort = await getFreePublicPort({ id, remoteEngine, engine, remoteIpAddress });
}
await prisma.database.update({
where: { id },
data: {
@@ -466,4 +460,69 @@ export async function saveDatabaseSettings(request: FastifyRequest<SaveDatabaseS
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
}
export async function getDatabaseSecrets(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
let secrets = await prisma.databaseSecret.findMany({
where: { databaseId: id },
orderBy: { createdAt: 'desc' }
});
secrets = secrets.map((secret) => {
secret.value = decrypt(secret.value);
return secret;
});
return {
secrets
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function saveDatabaseSecret(request: FastifyRequest<SaveDatabaseSecret>, reply: FastifyReply) {
try {
const { id } = request.params
let { name, value, isNew } = request.body
if (isNew) {
const found = await prisma.databaseSecret.findFirst({ where: { name, databaseId: id } });
if (found) {
throw `Secret ${name} already exists.`
} else {
value = encrypt(value.trim());
await prisma.databaseSecret.create({
data: { name, value, database: { connect: { id } } }
});
}
} else {
value = encrypt(value.trim());
const found = await prisma.databaseSecret.findFirst({ where: { databaseId: id, name } });
if (found) {
await prisma.databaseSecret.updateMany({
where: { databaseId: id, name },
data: { value }
});
} else {
await prisma.databaseSecret.create({
data: { name, value, database: { connect: { id } } }
});
}
}
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function deleteDatabaseSecret(request: FastifyRequest<DeleteDatabaseSecret>) {
try {
const { id } = request.params
const { name } = request.body
await prisma.databaseSecret.deleteMany({ where: { databaseId: id, name } });
return {}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}

View File

@@ -1,8 +1,9 @@
import { FastifyPluginAsync } from 'fastify';
import { deleteDatabase, getDatabase, getDatabaseLogs, getDatabaseStatus, getDatabaseTypes, getDatabaseUsage, getVersions, listDatabases, newDatabase, saveDatabase, saveDatabaseDestination, saveDatabaseSettings, saveDatabaseType, saveVersion, startDatabase, stopDatabase } from './handlers';
import { deleteDatabase, deleteDatabaseSecret, getDatabase, getDatabaseLogs, getDatabaseSecrets, getDatabaseStatus, getDatabaseTypes, getDatabaseUsage, getVersions, listDatabases, newDatabase, saveDatabase, saveDatabaseDestination, saveDatabaseSecret, saveDatabaseSettings, saveDatabaseType, saveVersion, startDatabase, stopDatabase } from './handlers';
import type { GetDatabaseLogs, OnlyId, SaveDatabase, SaveDatabaseDestination, SaveDatabaseSettings, SaveVersion } from '../../../../types';
import type { SaveDatabaseType } from './types';
import type { OnlyId } from '../../../../types';
import type { DeleteDatabase, SaveDatabaseType, DeleteDatabaseSecret, GetDatabaseLogs, SaveDatabase, SaveDatabaseDestination, SaveDatabaseSecret, SaveDatabaseSettings, SaveVersion } from './types';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.addHook('onRequest', async (request) => {
@@ -13,12 +14,16 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get<OnlyId>('/:id', async (request) => await getDatabase(request));
fastify.post<SaveDatabase>('/:id', async (request, reply) => await saveDatabase(request, reply));
fastify.delete<OnlyId>('/:id', async (request) => await deleteDatabase(request));
fastify.delete<DeleteDatabase>('/:id', async (request) => await deleteDatabase(request));
fastify.get<OnlyId>('/:id/status', async (request) => await getDatabaseStatus(request));
fastify.post<SaveDatabaseSettings>('/:id/settings', async (request) => await saveDatabaseSettings(request));
fastify.get<OnlyId>('/:id/secrets', async (request) => await getDatabaseSecrets(request));
fastify.post<SaveDatabaseSecret>('/:id/secrets', async (request, reply) => await saveDatabaseSecret(request, reply));
fastify.delete<DeleteDatabaseSecret>('/:id/secrets', async (request) => await deleteDatabaseSecret(request));
fastify.get('/:id/configuration/type', async (request) => await getDatabaseTypes(request));
fastify.post<SaveDatabaseType>('/:id/configuration/type', async (request, reply) => await saveDatabaseType(request, reply));


@@ -2,4 +2,54 @@ import type { OnlyId } from "../../../../types";
export interface SaveDatabaseType extends OnlyId {
Body: { type: string }
}
}
export interface DeleteDatabase extends OnlyId {
Body: { force: string }
}
export interface SaveVersion extends OnlyId {
Body: {
version: string
}
}
export interface SaveDatabaseDestination extends OnlyId {
Body: {
destinationId: string
}
}
export interface GetDatabaseLogs extends OnlyId {
Querystring: {
since: number
}
}
export interface SaveDatabase extends OnlyId {
Body: {
name: string,
defaultDatabase: string,
dbUser: string,
dbUserPassword: string,
rootUser: string,
rootUserPassword: string,
version: string,
isRunning: boolean
}
}
export interface SaveDatabaseSettings extends OnlyId {
Body: {
isPublic: boolean,
appendOnly: boolean
}
}
export interface SaveDatabaseSecret extends OnlyId {
Body: {
name: string,
value: string,
isNew: string,
}
}
export interface DeleteDatabaseSecret extends OnlyId {
Body: {
name: string,
}
}
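Taken together, the handler, route and type changes above give every database its own small secrets API (GET/POST/DELETE on /databases/:id/secrets). A minimal client-side sketch follows; the /api/v1 prefix and bearer-token auth are assumptions here, since the real UI goes through the $lib/api wrapper:

const databaseId = 'clabc123'; // hypothetical id
const token = process.env.TOKEN; // hypothetical bearer token
const base = `/api/v1/databases/${databaseId}`;
const headers = { 'Content-Type': 'application/json', Authorization: `Bearer ${token}` };

// Create a secret. Note that isNew is a string in SaveDatabaseSecret, not a boolean.
await fetch(`${base}/secrets`, {
  method: 'POST',
  headers,
  body: JSON.stringify({ name: 'POSTGRES_PASSWORD', value: 'super-secret', isNew: 'true' })
});

// List secrets. The handler decrypts values before returning them.
const { secrets } = await (await fetch(`${base}/secrets`, { headers })).json();

// Delete a secret by name.
await fetch(`${base}/secrets`, { method: 'DELETE', headers, body: JSON.stringify({ name: 'POSTGRES_PASSWORD' }) });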


@@ -30,7 +30,6 @@ export async function listDestinations(request: FastifyRequest<ListDestinations>
destinations
}
} catch ({ status, message }) {
console.log({ status, message })
return errorHandler({ status, message })
}
}
@@ -114,7 +113,6 @@ export async function newDestination(request: FastifyRequest<NewDestination>, re
}
} catch ({ status, message }) {
console.log({ status, message })
return errorHandler({ status, message })
}
}
@@ -162,7 +160,6 @@ export async function startProxy(request: FastifyRequest<Proxy>) {
await startTraefikProxy(id);
return {}
} catch ({ status, message }) {
console.log({ status, message })
await stopTraefikProxy(id);
return errorHandler({ status, message })
}
@@ -205,23 +202,21 @@ export async function assignSSHKey(request: FastifyRequest) {
return errorHandler({ status, message })
}
}
export async function verifyRemoteDockerEngine(request: FastifyRequest, reply: FastifyReply) {
export async function verifyRemoteDockerEngine(request: FastifyRequest<OnlyId>, reply: FastifyReply) {
try {
const { id } = request.params;
await createRemoteEngineConfiguration(id);
const { remoteIpAddress, remoteUser, network } = await prisma.destinationDocker.findFirst({ where: { id } })
const { remoteIpAddress, remoteUser, network, isCoolifyProxyUsed } = await prisma.destinationDocker.findFirst({ where: { id } })
const host = `ssh://${remoteUser}@${remoteIpAddress}`
const { stdout } = await asyncExecShell(`DOCKER_HOST=${host} docker network ls --filter 'name=${network}' --no-trunc --format "{{json .}}"`);
if (!stdout) {
await asyncExecShell(`DOCKER_HOST=${host} docker network create --attachable ${network}`);
}
const { stdout:coolifyNetwork } = await asyncExecShell(`DOCKER_HOST=${host} docker network ls --filter 'name=coolify-infra' --no-trunc --format "{{json .}}"`);
const { stdout: coolifyNetwork } = await asyncExecShell(`DOCKER_HOST=${host} docker network ls --filter 'name=coolify-infra' --no-trunc --format "{{json .}}"`);
if (!coolifyNetwork) {
await asyncExecShell(`DOCKER_HOST=${host} docker network create --attachable coolify-infra`);
}
if (isCoolifyProxyUsed) await startTraefikProxy(id);
await prisma.destinationDocker.update({ where: { id }, data: { remoteVerified: true } })
return reply.code(201).send()
@@ -234,7 +229,7 @@ export async function getDestinationStatus(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
const destination = await prisma.destinationDocker.findUnique({ where: { id } })
const isRunning = await checkContainer({ dockerId: destination.id, container: 'coolify-proxy' })
const isRunning = await checkContainer({ dockerId: destination.id, container: 'coolify-proxy', remove: true })
return {
isRunning
}


@@ -23,7 +23,7 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.post('/:id/configuration/sshKey', async (request) => await assignSSHKey(request));
fastify.post('/:id/verify', async (request, reply) => await verifyRemoteDockerEngine(request, reply));
fastify.post<OnlyId>('/:id/verify', async (request, reply) => await verifyRemoteDockerEngine(request, reply));
};
export default root;
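For reference, the verification handler above boils down to a couple of Docker CLI calls run on the remote host over SSH: create the destination network and the shared coolify-infra network if either is missing, then mark the destination as verified. A rough standalone sketch of the same idea (the helper name and host are illustrative, not Coolify's own utilities):

import { exec } from 'node:child_process';
import { promisify } from 'node:util';
const sh = promisify(exec);

// Ensure a named, attachable network exists on a Docker engine reached via SSH.
async function ensureNetwork(sshHost: string, network: string) {
  const env = `DOCKER_HOST=ssh://${sshHost}`;
  const { stdout } = await sh(`${env} docker network ls --filter 'name=${network}' --no-trunc --format "{{json .}}"`);
  if (!stdout) await sh(`${env} docker network create --attachable ${network}`);
}

// e.g. await ensureNetwork('root@203.0.113.10', 'my-destination-network');
//      await ensureNetwork('root@203.0.113.10', 'coolify-infra');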


@@ -1,11 +1,10 @@
import os from 'node:os';
import osu from 'node-os-utils';
import axios from 'axios';
import compare from 'compare-versions';
import { compareVersions } from 'compare-versions';
import cuid from 'cuid';
import bcrypt from 'bcryptjs';
import { asyncExecShell, asyncSleep, cleanupDockerStorage, errorHandler, isDev, listSettings, prisma, uniqueName, version } from '../../../lib/common';
import { supportedServiceTypesAndVersions } from '../../../lib/services/supportedVersions';
import type { FastifyReply, FastifyRequest } from 'fastify';
import type { Login, Update } from '.';
import type { GetCurrentUser } from './types';
@@ -15,9 +14,10 @@ export async function hashPassword(password: string): Promise<string> {
return bcrypt.hash(password, saltRounds);
}
export async function cleanupManually() {
export async function cleanupManually(request: FastifyRequest) {
try {
const destination = await prisma.destinationDocker.findFirst({ where: { engine: '/var/run/docker.sock' } })
const { serverId } = request.body;
const destination = await prisma.destinationDocker.findUnique({ where: { id: serverId } })
await cleanupDockerStorage(destination.id, true, true)
return {}
} catch ({ status, message }) {
@@ -32,7 +32,7 @@ export async function checkUpdate(request: FastifyRequest) {
`https://get.coollabs.io/versions.json?appId=${process.env['COOLIFY_APP_ID']}&version=${currentVersion}`
);
const latestVersion = versions['coolify'].main.version
const isUpdateAvailable = compare(latestVersion, currentVersion);
const isUpdateAvailable = compareVersions(latestVersion, currentVersion);
if (isStaging) {
return {
isUpdateAvailable: true,
@@ -52,9 +52,7 @@ export async function update(request: FastifyRequest<Update>) {
const { latestVersion } = request.body;
try {
if (!isDev) {
const { isAutoUpdateEnabled } = (await prisma.setting.findFirst()) || {
isAutoUpdateEnabled: false
};
const { isAutoUpdateEnabled } = await prisma.setting.findFirst();
await asyncExecShell(`docker pull coollabsio/coolify:${latestVersion}`);
await asyncExecShell(`env | grep COOLIFY > .env`);
await asyncExecShell(
@@ -65,7 +63,6 @@ export async function update(request: FastifyRequest<Update>) {
);
return {};
} else {
console.log(latestVersion);
await asyncSleep(2000);
return {};
}
@@ -73,44 +70,54 @@ export async function update(request: FastifyRequest<Update>) {
return errorHandler({ status, message })
}
}
export async function showUsage() {
export async function restartCoolify(request: FastifyRequest<any>) {
try {
return {
usage: {
uptime: os.uptime(),
memory: await osu.mem.info(),
cpu: {
load: os.loadavg(),
usage: await osu.cpu.usage(),
count: os.cpus().length
},
disk: await osu.drive.info('/')
const teamId = request.user.teamId;
if (teamId === '0') {
if (!isDev) {
asyncExecShell(`docker restart coolify`);
return {};
} else {
return {};
}
};
}
throw { status: 500, message: 'You are not authorized to restart Coolify.' };
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function showDashboard(request: FastifyRequest) {
try {
const userId = request.user.userId;
const teamId = request.user.teamId;
const applications = await prisma.application.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { settings: true }
include: { settings: true, destinationDocker: true, teams: true }
});
const databases = await prisma.database.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } }
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { settings: true, destinationDocker: true, teams: true }
});
const services = await prisma.service.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } }
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { destinationDocker: true, teams: true }
});
const gitSources = await prisma.gitSource.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { teams: true }
});
const destinations = await prisma.destinationDocker.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { teams: true }
});
const settings = await listSettings();
return {
applications,
databases,
services,
gitSources,
destinations,
settings,
};
} catch ({ status, message }) {
@@ -282,6 +289,7 @@ export async function getCurrentUser(request: FastifyRequest<GetCurrentUser>, fa
}
return {
settings: await prisma.setting.findFirst(),
supportedServiceTypesAndVersions,
token,
...request.user
}


@@ -158,8 +158,11 @@ export async function getTeam(request: FastifyRequest<OnlyId>, reply: FastifyRep
});
const team = await prisma.team.findUnique({ where: { id }, include: { permissions: true } });
const invitations = await prisma.teamInvitation.findMany({ where: { teamId: team.id } });
const { teams } = await prisma.user.findUnique({ where: { id: userId }, include: { teams: true } })
return {
currentTeam: teamId,
team,
teams,
permissions,
invitations
};
@@ -275,10 +278,10 @@ export async function inviteToTeam(request: FastifyRequest<InviteToTeam>, reply:
if (!userFound) {
throw {
message: `No user found with '${email}' email address.`
};
};
}
const uid = userFound.id;
if (uid === userId) {
if (uid === userId) {
throw {
message: `Invitation to yourself? Whaaaaat?`
};


@@ -1,5 +1,5 @@
import { FastifyPluginAsync } from 'fastify';
import { checkUpdate, login, showDashboard, update, showUsage, getCurrentUser, cleanupManually } from './handlers';
import { checkUpdate, login, showDashboard, update, showUsage, getCurrentUser, cleanupManually, restartCoolify } from './handlers';
import { GetCurrentUser } from './types';
export interface Update {
@@ -43,13 +43,13 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
onRequest: [fastify.authenticate]
}, async (request) => await showDashboard(request));
fastify.get('/usage', {
fastify.post('/internal/restart', {
onRequest: [fastify.authenticate]
}, async () => await showUsage());
}, async (request) => await restartCoolify(request));
fastify.post('/internal/cleanup', {
onRequest: [fastify.authenticate]
}, async () => await cleanupManually());
}, async (request) => await cleanupManually(request));
};
export default root;


@@ -0,0 +1,119 @@
import type { FastifyRequest } from 'fastify';
import { errorHandler, executeDockerCmd, prisma, createRemoteEngineConfiguration, executeSSHCmd } from '../../../../lib/common';
import os from 'node:os';
import osu from 'node-os-utils';
export async function listServers(request: FastifyRequest) {
try {
const userId = request.user.userId;
const teamId = request.user.teamId;
const servers = await prisma.destinationDocker.findMany({ where: { teams: { some: { id: teamId === '0' ? undefined : teamId } }, remoteEngine: false }, distinct: ['engine'] })
// const remoteServers = await prisma.destinationDocker.findMany({ where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } }, distinct: ['remoteIpAddress', 'engine'] })
return {
servers
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
const mappingTable = [
['K total memory', 'totalMemoryKB'],
['K used memory', 'usedMemoryKB'],
['K active memory', 'activeMemoryKB'],
['K inactive memory', 'inactiveMemoryKB'],
['K free memory', 'freeMemoryKB'],
['K buffer memory', 'bufferMemoryKB'],
['K swap cache', 'swapCacheKB'],
['K total swap', 'totalSwapKB'],
['K used swap', 'usedSwapKB'],
['K free swap', 'freeSwapKB'],
['non-nice user cpu ticks', 'nonNiceUserCpuTicks'],
['nice user cpu ticks', 'niceUserCpuTicks'],
['system cpu ticks', 'systemCpuTicks'],
['idle cpu ticks', 'idleCpuTicks'],
['IO-wait cpu ticks', 'ioWaitCpuTicks'],
['IRQ cpu ticks', 'irqCpuTicks'],
['softirq cpu ticks', 'softIrqCpuTicks'],
['stolen cpu ticks', 'stolenCpuTicks'],
['pages paged in', 'pagesPagedIn'],
['pages paged out', 'pagesPagedOut'],
['pages swapped in', 'pagesSwappedIn'],
['pages swapped out', 'pagesSwappedOut'],
['interrupts', 'interrupts'],
['CPU context switches', 'cpuContextSwitches'],
['boot time', 'bootTime'],
['forks', 'forks']
];
function parseFromText(text) {
var data = {};
var lines = text.split(/\r?\n/);
for (const line of lines) {
for (const [key, value] of mappingTable) {
if (line.indexOf(key) >= 0) {
const values = line.match(/[0-9]+/)[0];
data[value] = parseInt(values, 10);
}
}
}
return data;
}
export async function showUsage(request: FastifyRequest) {
const { id } = request.params;
let { remoteEngine } = request.query
remoteEngine = remoteEngine === 'true' ? true : false
if (remoteEngine) {
const { stdout: stats } = await executeSSHCmd({ dockerId: id, command: `vmstat -s` })
const { stdout: disks } = await executeSSHCmd({ dockerId: id, command: `df -m / --output=size,used,pcent|grep -v 'Used'| xargs` })
const { stdout: cpus } = await executeSSHCmd({ dockerId: id, command: `nproc --all` })
// const { stdout: cpuUsage } = await executeSSHCmd({ dockerId: id, command: `echo $[100-$(vmstat 1 2|tail -1|awk '{print $15}')]` })
// console.log(cpuUsage)
const parsed: any = parseFromText(stats)
return {
usage: {
uptime: parsed.bootTime / 1024,
memory: {
totalMemMb: parsed.totalMemoryKB / 1024,
usedMemMb: parsed.usedMemoryKB / 1024,
freeMemMb: parsed.freeMemoryKB / 1024,
usedMemPercentage: (parsed.usedMemoryKB / parsed.totalMemoryKB) * 100,
freeMemPercentage: (parsed.totalMemoryKB - parsed.usedMemoryKB) / parsed.totalMemoryKB * 100
},
cpu: {
load: 0,
usage: 0,
count: cpus
},
disk: {
totalGb: (disks.split(' ')[0] / 1024).toFixed(1),
usedGb: (disks.split(' ')[1] / 1024).toFixed(1),
freeGb: (disks.split(' ')[0] - disks.split(' ')[1]).toFixed(1),
usedPercentage: disks.split(' ')[2].replace('%', ''),
freePercentage: 100 - disks.split(' ')[2].replace('%', '')
}
}
}
} else {
try {
return {
usage: {
uptime: os.uptime(),
memory: await osu.mem.info(),
cpu: {
load: os.loadavg(),
usage: await osu.cpu.usage(),
count: os.cpus().length
},
disk: await osu.drive.info('/')
}
};
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
}
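The mappingTable/parseFromText pair above turns raw `vmstat -s` output into a flat object of numbers, which showUsage then converts (KB figures to MB for memory, `df -m` columns to GB for disk). A quick illustration with made-up values:

// Hypothetical excerpt of `vmstat -s` output fetched over SSH:
const sample = [
  '      4030668 K total memory',
  '      1827236 K used memory',
  '      2203432 K free memory',
  '      1361181 boot time'
].join('\n');

parseFromText(sample);
// => { totalMemoryKB: 4030668, usedMemoryKB: 1827236, freeMemoryKB: 2203432, bootTime: 1361181 }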


@@ -0,0 +1,14 @@
import { FastifyPluginAsync } from 'fastify';
import { listServers, showUsage } from './handlers';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.addHook('onRequest', async (request) => {
return await request.jwtVerify()
})
fastify.get('/', async (request) => await listServers(request));
fastify.get('/usage/:id', async (request) => await showUsage(request));
};
export default root;
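These are the two endpoints the new servers view relies on: GET / for the de-duplicated server list and GET /usage/:id for live metrics. A hedged client example (the UI actually calls it through $lib/api as get('/servers/usage/<id>?remoteEngine=...'), so the /api/v1 prefix here is an assumption; serverId and token are hypothetical):

const { usage } = await (await fetch(`/api/v1/servers/usage/${serverId}?remoteEngine=true`, {
  headers: { Authorization: `Bearer ${token}` } // JWT, enforced by the onRequest hook above
})).json();
// usage => { uptime, memory: { totalMemMb, usedMemMb, ... }, cpu: { load, usage, count }, disk: { totalGb, usedGb, ... } }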


@@ -0,0 +1,27 @@
import { OnlyId } from "../../../../types"
export interface SaveTeam extends OnlyId {
Body: {
name: string
}
}
export interface InviteToTeam {
Body: {
email: string,
permission: string,
teamId: string,
teamName: string
}
}
export interface BodyId {
Body: {
id: string
}
}
export interface SetPermission {
Body: {
userId: string,
newPermission: string,
permissionId: string
}
}

(File diff suppressed because it is too large.)


@@ -26,12 +26,11 @@ import {
saveServiceType,
saveServiceVersion,
setSettingsService,
startService,
stopService
} from './handlers';
import type { OnlyId } from '../../../../types';
import type { ActivateWordpressFtp, CheckService, CheckServiceDomain, DeleteServiceSecret, DeleteServiceStorage, GetServiceLogs, SaveService, SaveServiceDestination, SaveServiceSecret, SaveServiceSettings, SaveServiceStorage, SaveServiceType, SaveServiceVersion, ServiceStartStop, SetWordpressSettings } from './types';
import type { ActivateWordpressFtp, CheckService, CheckServiceDomain, DeleteServiceSecret, DeleteServiceStorage, GetServiceLogs, SaveService, SaveServiceDestination, SaveServiceSecret, SaveServiceSettings, SaveServiceStorage, SaveServiceType, SaveServiceVersion, ServiceStartStop, SetGlitchTipSettings, SetWordpressSettings } from './types';
import { startService, stopService } from '../../../../lib/services/handlers';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.addHook('onRequest', async (request) => {
@@ -72,7 +71,7 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.post<ServiceStartStop>('/:id/:type/start', async (request) => await startService(request));
fastify.post<ServiceStartStop>('/:id/:type/stop', async (request) => await stopService(request));
fastify.post<ServiceStartStop & SetWordpressSettings>('/:id/:type/settings', async (request, reply) => await setSettingsService(request, reply));
fastify.post<ServiceStartStop & SetWordpressSettings & SetGlitchTipSettings>('/:id/:type/settings', async (request, reply) => await setSettingsService(request, reply));
fastify.post<OnlyId>('/:id/plausibleanalytics/activate', async (request, reply) => await activatePlausibleUsers(request, reply));
fastify.post<OnlyId>('/:id/plausibleanalytics/cleanup', async (request, reply) => await cleanupPlausibleLogs(request, reply));


@@ -89,6 +89,10 @@ export interface ActivateWordpressFtp extends OnlyId {
}
}
export interface SetGlitchTipSettings extends OnlyId {
Body: {
enableOpenUserRegistration: boolean,
emailSmtpUseSsl: boolean,
emailSmtpUseTls: boolean
}
}
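With the widened route signature above, the shared settings endpoint now also accepts GlitchTip-specific flags. A hypothetical request body for a GlitchTip service (field names from SetGlitchTipSettings; the /services prefix and the exact merge with the existing Wordpress fields are assumptions):

// POST /services/:id/glitchtip/settings
const body = {
  enableOpenUserRegistration: false,
  emailSmtpUseSsl: true,
  emailSmtpUseTls: false
};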


@@ -28,6 +28,7 @@ export async function saveSettings(request: FastifyRequest<SaveSettings>, reply:
try {
const {
fqdn,
isAPIDebuggingEnabled,
isRegistrationEnabled,
dualCerts,
minPort,
@@ -39,7 +40,7 @@ export async function saveSettings(request: FastifyRequest<SaveSettings>, reply:
const { id } = await listSettings();
await prisma.setting.update({
where: { id },
data: { isRegistrationEnabled, dualCerts, isAutoUpdateEnabled, isDNSCheckEnabled, DNSServers }
data: { isRegistrationEnabled, dualCerts, isAutoUpdateEnabled, isDNSCheckEnabled, DNSServers, isAPIDebuggingEnabled }
});
if (fqdn) {
await prisma.setting.update({ where: { id }, data: { fqdn } });
@@ -57,7 +58,7 @@ export async function deleteDomain(request: FastifyRequest<DeleteDomain>, reply:
const { fqdn } = request.body
const { DNSServers } = await listSettings();
if (DNSServers) {
dns.setServers([DNSServers]);
dns.setServers([...DNSServers.split(',')]);
}
let ip;
try {


@@ -3,6 +3,7 @@ import { OnlyId } from "../../../../types"
export interface SaveSettings {
Body: {
fqdn: string,
isAPIDebuggingEnabled: boolean,
isRegistrationEnabled: boolean,
dualCerts: boolean,
minPort: number,


@@ -3,8 +3,7 @@ import cuid from "cuid";
import crypto from "crypto";
import { encrypt, errorHandler, getUIUrl, isDev, prisma } from "../../../lib/common";
import { checkContainer, removeContainer } from "../../../lib/docker";
import { scheduler } from "../../../lib/scheduler";
import { getApplicationFromDBWebhook } from "../../api/v1/applications/handlers";
import { createdBranchDatabase, getApplicationFromDBWebhook, removeBranchDatabase } from "../../api/v1/applications/handlers";
import type { FastifyReply, FastifyRequest } from "fastify";
import type { GitHubEvents, InstallGithub } from "./types";
@@ -67,7 +66,6 @@ export async function configureGitHubApp(request, reply) {
}
export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promise<any> {
try {
const buildId = cuid();
const allowedGithubEvents = ['push', 'pull_request'];
const allowedActions = ['opened', 'reopened', 'synchronize', 'closed'];
const githubEvent = request.headers['x-github-event']?.toString().toLowerCase();
@@ -87,137 +85,139 @@ export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promi
if (!projectId || !branch) {
throw { status: 500, message: 'Cannot parse projectId or branch from the webhook?!' }
}
const applicationFound = await getApplicationFromDBWebhook(projectId, branch);
if (applicationFound) {
const webhookSecret = applicationFound.gitSource.githubApp.webhookSecret || null;
//@ts-ignore
const hmac = crypto.createHmac('sha256', webhookSecret);
const digest = Buffer.from(
'sha256=' + hmac.update(JSON.stringify(body)).digest('hex'),
'utf8'
);
if (!isDev) {
const checksum = Buffer.from(githubSignature, 'utf8');
const applicationsFound = await getApplicationFromDBWebhook(projectId, branch);
if (applicationsFound && applicationsFound.length > 0) {
for (const application of applicationsFound) {
const buildId = cuid();
const webhookSecret = application.gitSource.githubApp.webhookSecret || null;
//@ts-ignore
if (checksum.length !== digest.length || !crypto.timingSafeEqual(digest, checksum)) {
throw { status: 500, message: 'SHA256 checksum failed. Are you doing something fishy?' }
};
}
if (githubEvent === 'push') {
if (!applicationFound.configHash) {
const configHash = crypto
//@ts-ignore
.createHash('sha256')
.update(
JSON.stringify({
buildPack: applicationFound.buildPack,
port: applicationFound.port,
exposePort: applicationFound.exposePort,
installCommand: applicationFound.installCommand,
buildCommand: applicationFound.buildCommand,
startCommand: applicationFound.startCommand
})
)
.digest('hex');
await prisma.application.updateMany({
where: { branch, projectId },
data: { configHash }
});
}
await prisma.application.update({
where: { id: applicationFound.id },
data: { updatedAt: new Date() }
});
await prisma.build.create({
data: {
id: buildId,
applicationId: applicationFound.id,
destinationDockerId: applicationFound.destinationDocker.id,
gitSourceId: applicationFound.gitSource.id,
githubAppId: applicationFound.gitSource.githubApp?.id,
gitlabAppId: applicationFound.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_commit'
}
});
scheduler.workers.get('deployApplication').postMessage({
build_id: buildId,
type: 'webhook_commit',
...applicationFound
});
return {
message: 'Queued. Thank you!'
};
} else if (githubEvent === 'pull_request') {
const pullmergeRequestId = body.number;
const pullmergeRequestAction = body.action;
const sourceBranch = body.pull_request.head.ref.includes('/') ? body.pull_request.head.ref.split('/')[2] : body.pull_request.head.ref;
if (!allowedActions.includes(pullmergeRequestAction)) {
throw { status: 500, message: 'Action not allowed.' }
const hmac = crypto.createHmac('sha256', webhookSecret);
const digest = Buffer.from(
'sha256=' + hmac.update(JSON.stringify(body)).digest('hex'),
'utf8'
);
if (!isDev) {
const checksum = Buffer.from(githubSignature, 'utf8');
//@ts-ignore
if (checksum.length !== digest.length || !crypto.timingSafeEqual(digest, checksum)) {
throw { status: 500, message: 'SHA256 checksum failed. Are you doing something fishy?' }
};
}
if (applicationFound.settings.previews) {
if (applicationFound.destinationDockerId) {
const isRunning = await checkContainer(
{
dockerId: applicationFound.destinationDocker.id,
container: applicationFound.id
}
);
if (!isRunning) {
throw { status: 500, message: 'Application not running.' }
}
}
if (
pullmergeRequestAction === 'opened' ||
pullmergeRequestAction === 'reopened' ||
pullmergeRequestAction === 'synchronize'
) {
if (githubEvent === 'push') {
if (!application.configHash) {
const configHash = crypto
//@ts-ignore
.createHash('sha256')
.update(
JSON.stringify({
buildPack: application.buildPack,
port: application.port,
exposePort: application.exposePort,
installCommand: application.installCommand,
buildCommand: application.buildCommand,
startCommand: application.startCommand
})
)
.digest('hex');
await prisma.application.update({
where: { id: applicationFound.id },
data: { updatedAt: new Date() }
where: { id: application.id },
data: { configHash }
});
await prisma.build.create({
data: {
id: buildId,
applicationId: applicationFound.id,
destinationDockerId: applicationFound.destinationDocker.id,
gitSourceId: applicationFound.gitSource.id,
githubAppId: applicationFound.gitSource.githubApp?.id,
gitlabAppId: applicationFound.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_pr'
}
});
scheduler.workers.get('deployApplication').postMessage({
build_id: buildId,
type: 'webhook_pr',
...applicationFound,
sourceBranch,
pullmergeRequestId
});
return {
message: 'Queued. Thank you!'
};
} else if (pullmergeRequestAction === 'closed') {
if (applicationFound.destinationDockerId) {
const id = `${applicationFound.id}-${pullmergeRequestId}`;
await removeContainer({ id, dockerId: applicationFound.destinationDocker.id });
}
return {
message: 'Removed preview. Thank you!'
};
}
} else {
throw { status: 500, message: 'Pull request previews are not enabled.' }
await prisma.application.update({
where: { id: application.id },
data: { updatedAt: new Date() }
});
await prisma.build.create({
data: {
id: buildId,
applicationId: application.id,
destinationDockerId: application.destinationDocker.id,
gitSourceId: application.gitSource.id,
githubAppId: application.gitSource.githubApp?.id,
gitlabAppId: application.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_commit'
}
});
console.log(`Webhook for ${application.name} queued.`)
} else if (githubEvent === 'pull_request') {
const pullmergeRequestId = body.number.toString();
const pullmergeRequestAction = body.action;
const sourceBranch = body.pull_request.head.ref.includes('/') ? body.pull_request.head.ref.split('/')[2] : body.pull_request.head.ref;
if (!allowedActions.includes(pullmergeRequestAction)) {
throw { status: 500, message: 'Action not allowed.' }
}
if (application.settings.previews) {
if (application.destinationDockerId) {
const isRunning = await checkContainer(
{
dockerId: application.destinationDocker.id,
container: application.id
}
);
if (!isRunning) {
throw { status: 500, message: 'Application not running.' }
}
}
if (
pullmergeRequestAction === 'opened' ||
pullmergeRequestAction === 'reopened' ||
pullmergeRequestAction === 'synchronize'
) {
await prisma.application.update({
where: { id: application.id },
data: { updatedAt: new Date() }
});
// if (application.connectedDatabase && pullmergeRequestAction === 'opened' || pullmergeRequestAction === 'reopened') {
// // Coolify hosted database
// if (application.connectedDatabase.databaseId) {
// const databaseId = application.connectedDatabase.databaseId;
// const database = await prisma.database.findUnique({ where: { id: databaseId } });
// if (database) {
// await createdBranchDatabase(database, application.connectedDatabase.hostedDatabaseDBName, pullmergeRequestId);
// }
// }
// }
await prisma.build.create({
data: {
id: buildId,
pullmergeRequestId,
sourceBranch,
applicationId: application.id,
destinationDockerId: application.destinationDocker.id,
gitSourceId: application.gitSource.id,
githubAppId: application.gitSource.githubApp?.id,
gitlabAppId: application.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_pr'
}
});
} else if (pullmergeRequestAction === 'closed') {
if (application.destinationDockerId) {
const id = `${application.id}-${pullmergeRequestId}`;
try {
await removeContainer({ id, dockerId: application.destinationDocker.id });
} catch (error) { }
}
if (application.connectedDatabase.databaseId) {
const databaseId = application.connectedDatabase.databaseId;
const database = await prisma.database.findUnique({ where: { id: databaseId } });
if (database) {
await removeBranchDatabase(database, pullmergeRequestId);
}
}
}
}
}
}
}
throw { status: 500, message: 'Not handled event.' }
} catch ({ status, message }) {
return errorHandler({ status, message })
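// Aside, not part of the diff: the signature check above is the standard GitHub webhook pattern –
// recompute an HMAC-SHA256 over the payload with the app's webhook secret and compare it to the
// X-Hub-Signature-256 header in constant time. A minimal sketch of the same check:
//
//   import crypto from 'crypto';
//   function verifyGithubSignature(secret: string, payload: string, header: string): boolean {
//     const digest = Buffer.from('sha256=' + crypto.createHmac('sha256', secret).update(payload).digest('hex'), 'utf8');
//     const checksum = Buffer.from(header, 'utf8');
//     return checksum.length === digest.length && crypto.timingSafeEqual(digest, checksum);
//   }
//
// The handler hashes JSON.stringify(body), so the comparison holds as long as the re-serialised
// body matches the bytes GitHub originally signed.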
}


@@ -2,9 +2,8 @@ import axios from "axios";
import cuid from "cuid";
import crypto from "crypto";
import type { FastifyReply, FastifyRequest } from "fastify";
import { errorHandler, getAPIUrl, isDev, listSettings, prisma } from "../../../lib/common";
import { errorHandler, getAPIUrl, getUIUrl, isDev, listSettings, prisma } from "../../../lib/common";
import { checkContainer, removeContainer } from "../../../lib/docker";
import { scheduler } from "../../../lib/scheduler";
import { getApplicationFromDB, getApplicationFromDBWebhook } from "../../api/v1/applications/handlers";
import type { ConfigureGitLabApp, GitLabEvents } from "./types";
@@ -30,7 +29,7 @@ export async function configureGitLabApp(request: FastifyRequest<ConfigureGitLab
});
const { data } = await axios.post(`${htmlUrl}/oauth/token`, params)
if (isDev) {
return reply.redirect(`${getAPIUrl()}/webhooks/success?token=${data.access_token}`)
return reply.redirect(`${getUIUrl()}/webhooks/success?token=${data.access_token}`)
}
return reply.redirect(`/webhooks/success?token=${data.access_token}`)
} catch ({ status, message, ...other }) {
@@ -40,65 +39,56 @@ export async function configureGitLabApp(request: FastifyRequest<ConfigureGitLab
export async function gitLabEvents(request: FastifyRequest<GitLabEvents>) {
const { object_kind: objectKind, ref, project_id } = request.body
try {
const buildId = cuid();
const allowedActions = ['opened', 'reopen', 'close', 'open', 'update'];
const webhookToken = request.headers['x-gitlab-token'];
if (!webhookToken) {
if (!webhookToken && !isDev) {
throw { status: 500, message: 'Invalid webhookToken.' }
}
if (objectKind === 'push') {
const projectId = Number(project_id);
const branch = ref.split('/')[2];
const applicationFound = await getApplicationFromDBWebhook(projectId, branch);
if (applicationFound) {
if (!applicationFound.configHash) {
const configHash = crypto
.createHash('sha256')
.update(
JSON.stringify({
buildPack: applicationFound.buildPack,
port: applicationFound.port,
exposePort: applicationFound.exposePort,
installCommand: applicationFound.installCommand,
buildCommand: applicationFound.buildCommand,
startCommand: applicationFound.startCommand
})
)
.digest('hex');
await prisma.application.updateMany({
where: { branch, projectId },
data: { configHash }
const applicationsFound = await getApplicationFromDBWebhook(projectId, branch);
if (applicationsFound && applicationsFound.length > 0) {
for (const application of applicationsFound) {
const buildId = cuid();
if (!application.configHash) {
const configHash = crypto
.createHash('sha256')
.update(
JSON.stringify({
buildPack: application.buildPack,
port: application.port,
exposePort: application.exposePort,
installCommand: application.installCommand,
buildCommand: application.buildCommand,
startCommand: application.startCommand
})
)
.digest('hex');
await prisma.application.update({
where: { id: application.id },
data: { configHash }
});
}
await prisma.application.update({
where: { id: application.id },
data: { updatedAt: new Date() }
});
await prisma.build.create({
data: {
id: buildId,
applicationId: application.id,
destinationDockerId: application.destinationDocker.id,
gitSourceId: application.gitSource.id,
githubAppId: application.gitSource.githubApp?.id,
gitlabAppId: application.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_commit'
}
});
}
await prisma.application.update({
where: { id: applicationFound.id },
data: { updatedAt: new Date() }
});
await prisma.build.create({
data: {
id: buildId,
applicationId: applicationFound.id,
destinationDockerId: applicationFound.destinationDocker.id,
gitSourceId: applicationFound.gitSource.id,
githubAppId: applicationFound.gitSource.githubApp?.id,
gitlabAppId: applicationFound.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_commit'
}
});
scheduler.workers.get('deployApplication').postMessage({
build_id: buildId,
type: 'webhook_commit',
...applicationFound
});
return {
message: 'Queued. Thank you!'
};
}
} else if (objectKind === 'merge_request') {
const { object_attributes: { work_in_progress: isDraft, action, source_branch: sourceBranch, target_branch: targetBranch, iid: pullmergeRequestId }, project: { id } } = request.body
@@ -111,71 +101,63 @@ export async function gitLabEvents(request: FastifyRequest<GitLabEvents>) {
throw { status: 500, message: 'Draft MR, do nothing.' }
}
const applicationFound = await getApplicationFromDBWebhook(projectId, targetBranch);
if (applicationFound) {
if (applicationFound.settings.previews) {
if (applicationFound.destinationDockerId) {
const isRunning = await checkContainer(
{
dockerId: applicationFound.destinationDocker.id,
container: applicationFound.id
const applicationsFound = await getApplicationFromDBWebhook(projectId, targetBranch);
if (applicationsFound && applicationsFound.length > 0) {
for (const application of applicationsFound) {
const buildId = cuid();
if (application.settings.previews) {
if (application.destinationDockerId) {
const isRunning = await checkContainer(
{
dockerId: application.destinationDocker.id,
container: application.id
}
);
if (!isRunning) {
throw { status: 500, message: 'Application not running.' }
}
);
if (!isRunning) {
throw { status: 500, message: 'Application not running.' }
}
}
if (!isDev && applicationFound.gitSource.gitlabApp.webhookToken !== webhookToken) {
throw { status: 500, message: 'Invalid webhookToken. Are you doing something nasty?!' }
}
if (
action === 'opened' ||
action === 'reopen' ||
action === 'open' ||
action === 'update'
) {
await prisma.application.update({
where: { id: applicationFound.id },
data: { updatedAt: new Date() }
});
await prisma.build.create({
data: {
id: buildId,
applicationId: applicationFound.id,
destinationDockerId: applicationFound.destinationDocker.id,
gitSourceId: applicationFound.gitSource.id,
githubAppId: applicationFound.gitSource.githubApp?.id,
gitlabAppId: applicationFound.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_mr'
if (!isDev && application.gitSource.gitlabApp.webhookToken !== webhookToken) {
throw { status: 500, message: 'Invalid webhookToken. Are you doing something nasty?!' }
}
if (
action === 'opened' ||
action === 'reopen' ||
action === 'open' ||
action === 'update'
) {
await prisma.application.update({
where: { id: application.id },
data: { updatedAt: new Date() }
});
await prisma.build.create({
data: {
id: buildId,
pullmergeRequestId: pullmergeRequestId.toString(),
sourceBranch,
applicationId: application.id,
destinationDockerId: application.destinationDocker.id,
gitSourceId: application.gitSource.id,
githubAppId: application.gitSource.githubApp?.id,
gitlabAppId: application.gitSource.gitlabApp?.id,
status: 'queued',
type: 'webhook_mr'
}
});
return {
message: 'Queued. Thank you!'
};
} else if (action === 'close') {
if (application.destinationDockerId) {
const id = `${application.id}-${pullmergeRequestId}`;
await removeContainer({ id, dockerId: application.destinationDocker.id });
}
});
scheduler.workers.get('deployApplication').postMessage({
build_id: buildId,
type: 'webhook_mr',
...applicationFound,
sourceBranch,
pullmergeRequestId
});
return {
message: 'Queued. Thank you!'
};
} else if (action === 'close') {
if (applicationFound.destinationDockerId) {
const id = `${applicationFound.id}-${pullmergeRequestId}`;
const engine = applicationFound.destinationDocker.engine;
await removeContainer({ id, dockerId: applicationFound.destinationDocker.id });
}
return {
message: 'Removed preview. Thank you!'
};
}
}
throw { status: 500, message: 'Merge request previews are not enabled.' }
}
}
throw { status: 500, message: 'Not handled event.' }
} catch ({ status, message }) {
return errorHandler({ status, message })
}


@@ -1,6 +1,9 @@
import { FastifyRequest } from "fastify";
import { errorHandler, getDomain, isDev, prisma, supportedServiceTypesAndVersions, include, executeDockerCmd } from "../../../lib/common";
import { errorHandler, getDomain, isDev, prisma, executeDockerCmd } from "../../../lib/common";
import { supportedServiceTypesAndVersions } from "../../../lib/services/supportedVersions";
import { includeServices } from "../../../lib/services/common";
import { TraefikOtherConfiguration } from "./types";
import { OnlyId } from "../../../types";
function configureMiddleware(
{ id, container, port, domain, nakedDomain, isHttps, isWWW, isDualCerts, scriptName, type },
@@ -23,7 +26,30 @@ function configureMiddleware(
]
}
};
if (type === 'appwrite') {
traefik.http.routers[`${id}-realtime`] = {
entrypoints: ['websecure'],
rule: `(Host(\`${nakedDomain}\`) || Host(\`www.${nakedDomain}\`)) && PathPrefix(\`/v1/realtime\`)`,
service: `${`${id}-realtime`}`,
tls: {
domains: {
main: `${domain}`
}
},
middlewares: []
};
traefik.http.services[`${id}-realtime`] = {
loadbalancer: {
servers: [
{
url: `http://${container}-realtime:${port}`
}
]
}
};
}
if (isDualCerts) {
traefik.http.routers[`${id}-secure`] = {
entrypoints: ['websecure'],
@@ -110,6 +136,23 @@ function configureMiddleware(
]
}
};
if (type === 'appwrite') {
traefik.http.routers[`${id}-realtime`] = {
entrypoints: ['web'],
rule: `(Host(\`${nakedDomain}\`) || Host(\`www.${nakedDomain}\`)) && PathPrefix(\`/v1/realtime\`)`,
service: `${id}-realtime`,
middlewares: []
};
traefik.http.services[`${id}-realtime`] = {
loadbalancer: {
servers: [
{
url: `http://${container}-realtime:${port}`
}
]
}
};
}
if (!isDualCerts) {
if (isWWW) {
@@ -234,7 +277,7 @@ export async function traefikConfiguration(request, reply) {
}
const services: any = await prisma.service.findMany({
where: { destinationDocker: { remoteEngine: false } },
include,
include: includeServices,
orderBy: { createdAt: 'desc' },
});
@@ -488,7 +531,7 @@ export async function traefikOtherConfiguration(request: FastifyRequest<TraefikO
}
}
export async function remoteTraefikConfiguration(request: FastifyRequest) {
export async function remoteTraefikConfiguration(request: FastifyRequest<OnlyId>) {
const { id } = request.params
try {
const traefik = {
@@ -590,7 +633,7 @@ export async function remoteTraefikConfiguration(request: FastifyRequest) {
}
const services: any = await prisma.service.findMany({
where: { destinationDocker: { id } },
include,
include: includeServices,
orderBy: { createdAt: 'desc' }
});
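// Aside, illustrative only: for an Appwrite service the configureMiddleware changes above add a second
// router/service pair so that /v1/realtime traffic is routed to the <container>-realtime sidecar.
// The generated dynamic configuration ends up shaped roughly like this (domain and port are placeholders):
//
//   "routers": {
//     "<id>-realtime": {
//       "entrypoints": ["websecure"],
//       "rule": "(Host(`example.com`) || Host(`www.example.com`)) && PathPrefix(`/v1/realtime`)",
//       "service": "<id>-realtime",
//       "tls": { "domains": { "main": "example.com" } },
//       "middlewares": []
//     }
//   },
//   "services": {
//     "<id>-realtime": { "loadbalancer": { "servers": [{ "url": "http://<container>-realtime:80" }] } }
//   }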


@@ -1,4 +1,5 @@
import { FastifyPluginAsync } from 'fastify';
import { OnlyId } from '../../../types';
import { remoteTraefikConfiguration, traefikConfiguration, traefikOtherConfiguration } from './handlers';
import { TraefikOtherConfiguration } from './types';
@@ -6,7 +7,7 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/main.json', async (request, reply) => traefikConfiguration(request, reply));
fastify.get<TraefikOtherConfiguration>('/other.json', async (request, reply) => traefikOtherConfiguration(request));
fastify.get('/remote/:id', async (request) => remoteTraefikConfiguration(request));
fastify.get<OnlyId>('/remote/:id', async (request) => remoteTraefikConfiguration(request));
};
export default root;


@@ -1,39 +1,4 @@
export interface OnlyId {
Params: { id: string },
}
export interface SaveVersion extends OnlyId {
Body: {
version: string
}
}
export interface SaveDatabaseDestination extends OnlyId {
Body: {
destinationId: string
}
}
export interface GetDatabaseLogs extends OnlyId {
Querystring: {
since: number
}
}
export interface SaveDatabase extends OnlyId {
Body: {
name: string,
defaultDatabase: string,
dbUser: string,
dbUserPassword: string,
rootUser: string,
rootUserPassword: string,
version: string,
isRunning: boolean
}
}
export interface SaveDatabaseSettings extends OnlyId {
Body: {
isPublic: boolean,
appendOnly: boolean
}
}

apps/i18n/.env.example (new file)

@@ -0,0 +1,4 @@
WEBLATE_INSTANCE_URL=http://localhost
WEBLATE_COMPONENT_NAME=coolify
WEBLATE_TOKEN=
TRANSLATION_DIR=

apps/i18n/.gitignore (new file)

@@ -0,0 +1 @@
locales/*

apps/i18n/index.mjs (new file)

@@ -0,0 +1,63 @@
import dotenv from 'dotenv';
dotenv.config()
import fs from 'fs'
import path from 'path'
import { fileURLToPath } from 'url';
import Gettext from 'node-gettext'
import { po } from 'gettext-parser'
import got from 'got';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const weblateInstanceURL = process.env.WEBLATE_INSTANCE_URL;
const weblateComponentName = process.env.WEBLATE_COMPONENT_NAME
const token = process.env.WEBLATE_TOKEN;
const translationsDir = process.env.TRANSLATION_DIR;
const translationsPODir = './locales';
const locales = []
const domain = 'locale'
const translations = await got(`${weblateInstanceURL}/api/components/${weblateComponentName}/glossary/translations/?format=json`, {
headers: {
"Authorization": `Token ${token}`
}
}).json()
for (const translation of translations.results) {
const code = translation.language_code
locales.push(code)
const fileUrl = translation.file_url.replace('=json', '=po')
const file = await got(fileUrl, {
headers: {
"Authorization": `Token ${token}`
}
}).text()
fs.writeFileSync(path.join(__dirname, translationsPODir, domain + '-' + code + '.po'), file)
}
const gt = new Gettext()
locales.forEach((locale) => {
let json = {}
const fileName = `${domain}-${locale}.po`
const translationsFilePath = path.join(translationsPODir, fileName)
const translationsContent = fs.readFileSync(translationsFilePath)
const parsedTranslations = po.parse(translationsContent)
const a = gt.gettext(parsedTranslations)
for (const [key, value] of Object.entries(a)) {
if (key === 'translations') {
for (const [key1, value1] of Object.entries(value)) {
if (key1 !== '') {
for (const [key2, value2] of Object.entries(value1)) {
json[value2.msgctxt] = value2.msgstr[0]
}
}
}
}
}
fs.writeFileSync(`${translationsDir}/${locale}.json`, JSON.stringify(json))
})
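A short usage note for the converter above, based on the .env.example and package.json added in this PR: fill in the Weblate instance URL, component name, API token and the target directory, then run the translate script; each Weblate language is written out as a flat <locale>.json keyed by msgctxt. The key and translation values below are hypothetical:

// cd apps/i18n && npm install && npm run translate
// Output lands in ${TRANSLATION_DIR}/<locale>.json, e.g. a hypothetical fr.json:
// { "login.title": "Connexion", "login.button": "Se connecter" }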

apps/i18n/package.json (new file)

@@ -0,0 +1,15 @@
{
"name": "i18n-converter",
"description": "Convert Weblate translations to sveltekit-i18n",
"license": "Apache-2.0",
"scripts": {
"translate": "node index.mjs"
},
"type": "module",
"dependencies": {
"node-gettext": "3.0.0",
"gettext-parser": "6.0.0",
"got": "12.3.1",
"dotenv": "16.0.2"
}
}


@@ -14,32 +14,38 @@
"format": "prettier --write --plugin-search-dir=. ."
},
"devDependencies": {
"@playwright/test": "1.24.2",
"@floating-ui/dom": "1.0.1",
"@playwright/test": "1.25.1",
"@popperjs/core": "2.11.6",
"@sveltejs/kit": "1.0.0-next.405",
"@types/js-cookie": "3.0.2",
"@typescript-eslint/eslint-plugin": "5.33.0",
"@typescript-eslint/parser": "5.33.0",
"@typescript-eslint/eslint-plugin": "5.36.1",
"@typescript-eslint/parser": "5.36.1",
"autoprefixer": "10.4.8",
"eslint": "8.21.0",
"classnames": "2.3.1",
"eslint": "8.23.0",
"eslint-config-prettier": "8.5.0",
"eslint-plugin-svelte3": "4.0.0",
"flowbite": "1.5.2",
"flowbite-svelte": "0.26.2",
"postcss": "8.4.16",
"prettier": "2.7.1",
"prettier-plugin-svelte": "2.7.0",
"svelte": "3.49.0",
"svelte-check": "2.8.0",
"svelte": "3.50.0",
"svelte-check": "2.9.0",
"svelte-preprocess": "4.10.7",
"tailwindcss": "3.1.8",
"tailwindcss-scrollbar": "0.1.0",
"tslib": "2.4.0",
"typescript": "4.7.4",
"vite": "3.0.5"
"typescript": "4.8.2",
"vite": "3.1.0"
},
"type": "module",
"dependencies": {
"@sveltejs/adapter-static": "1.0.0-next.39",
"@tailwindcss/typography": "^0.5.7",
"cuid": "2.1.8",
"daisyui": "2.22.0",
"daisyui": "2.24.2",
"js-cookie": "3.0.1",
"p-limit": "4.0.0",
"svelte-select": "4.4.7",


@@ -1,188 +1,5 @@
import { addToast } from '$lib/store';
export const supportedServiceTypesAndVersions = [
{
name: 'plausibleanalytics',
fancyName: 'Plausible Analytics',
baseImage: 'plausible/analytics',
images: ['bitnami/postgresql:13.2.0', 'yandex/clickhouse-server:21.3.2.5'],
versions: ['latest', 'stable'],
recommendedVersion: 'stable',
ports: {
main: 8000
}
},
{
name: 'nocodb',
fancyName: 'NocoDB',
baseImage: 'nocodb/nocodb',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'minio',
fancyName: 'MinIO',
baseImage: 'minio/minio',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 9001
}
},
{
name: 'vscodeserver',
fancyName: 'VSCode Server',
baseImage: 'codercom/code-server',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'wordpress',
fancyName: 'Wordpress',
baseImage: 'wordpress',
images: ['bitnami/mysql:5.7'],
versions: ['latest', 'php8.1', 'php8.0', 'php7.4', 'php7.3'],
recommendedVersion: 'latest',
ports: {
main: 80
}
},
{
name: 'vaultwarden',
fancyName: 'Vaultwarden',
baseImage: 'vaultwarden/server',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 80
}
},
{
name: 'languagetool',
fancyName: 'LanguageTool',
baseImage: 'silviof/docker-languagetool',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8010
}
},
{
name: 'n8n',
fancyName: 'n8n',
baseImage: 'n8nio/n8n',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 5678
}
},
{
name: 'uptimekuma',
fancyName: 'Uptime Kuma',
baseImage: 'louislam/uptime-kuma',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 3001
}
},
{
name: 'ghost',
fancyName: 'Ghost',
baseImage: 'bitnami/ghost',
images: ['bitnami/mariadb'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 2368
}
},
{
name: 'meilisearch',
fancyName: 'Meilisearch',
baseImage: 'getmeili/meilisearch',
images: [],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 7700
}
},
{
name: 'umami',
fancyName: 'Umami',
baseImage: 'ghcr.io/mikecao/umami',
images: ['postgres:12-alpine'],
versions: ['postgresql-latest'],
recommendedVersion: 'postgresql-latest',
ports: {
main: 3000
}
},
{
name: 'hasura',
fancyName: 'Hasura',
baseImage: 'hasura/graphql-engine',
images: ['postgres:12-alpine'],
versions: ['latest', 'v2.10.0', 'v2.5.1'],
recommendedVersion: 'v2.10.0',
ports: {
main: 8080
}
},
{
name: 'fider',
fancyName: 'Fider',
baseImage: 'getfider/fider',
images: ['postgres:12-alpine'],
versions: ['stable'],
recommendedVersion: 'stable',
ports: {
main: 3000
}
},
{
name: 'appwrite',
fancyName: 'Appwrite',
baseImage: 'appwrite/appwrite',
images: ['mariadb:10.7', 'redis:6.2-alpine', 'appwrite/telegraf:1.4.0'],
versions: ['latest', '0.15.3'],
recommendedVersion: '0.15.3',
ports: {
main: 80
}
},
// {
// name: 'moodle',
// fancyName: 'Moodle',
// baseImage: 'bitnami/moodle',
// images: [],
// versions: ['latest', 'v4.0.2'],
// recommendedVersion: 'latest',
// ports: {
// main: 8080
// }
// }
{
name: 'glitchTip',
fancyName: 'GlitchTip',
baseImage: 'glitchtip/glitchtip',
images: ['postgres:14-alpine', 'redis:7-alpine'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8000
}
},
];
export const asyncSleep = (delay: number) =>
new Promise((resolve) => setTimeout(resolve, delay));
@@ -254,15 +71,6 @@ export function changeQueryParams(buildId: string) {
return history.pushState(null, null, '?' + queryParams.toString());
}
export const getServiceMainPort = (service: string) => {
const serviceType = supportedServiceTypesAndVersions.find((s) => s.name === service);
if (serviceType) {
return serviceType.ports.main;
}
return null;
};
export function handlerNotFoundLoad(error: any, url: URL) {
if (error?.status === 404) {


@@ -0,0 +1,32 @@
<script lang="ts">
import Tooltip from './Tooltip.svelte';
export let url = 'https://docs.coollabs.io';
let id =
'cool-' +
url
.split('')
.map((c) => c.charCodeAt(0).toString(16).padStart(2, '0'))
.join('')
.slice(-16);
</script>
<a {id} href={url} target="_blank" class="icons inline-block text-pink-500 cursor-pointer text-xs">
<svg
xmlns="http://www.w3.org/2000/svg"
class="w-6 h-6"
viewBox="0 0 24 24"
stroke-width="1.5"
stroke="currentColor"
fill="none"
stroke-linecap="round"
stroke-linejoin="round"
>
<path stroke="none" d="M0 0h24v24H0z" fill="none" />
<path
d="M6 4h11a2 2 0 0 1 2 2v12a2 2 0 0 1 -2 2h-11a1 1 0 0 1 -1 -1v-14a1 1 0 0 1 1 -1m3 0v18"
/>
<line x1="13" y1="8" x2="15" y2="8" />
<line x1="13" y1="12" x2="15" y2="12" />
</svg>
</a>
<Tooltip triggeredBy={`#${id}`}>See details in the documentation</Tooltip>


@@ -1,6 +1,31 @@
<script lang="ts">
export let text: string;
export let customClass = 'max-w-[24rem]';
import { onMount } from 'svelte';
import Tooltip from './Tooltip.svelte';
export let explanation = '';
let id: any;
let self: any;
onMount(() => {
id = `info-${self.offsetLeft}-${self.offsetTop}`;
});
</script>
<div class="p-2 text-xs text-stone-400 {customClass}">{@html text}</div>
<div {id} class="inline-block mx-2 text-pink-500 cursor-pointer" bind:this={self}>
<svg
fill="none"
height="18"
shape-rendering="geometricPrecision"
stroke="currentColor"
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="1.5"
viewBox="0 0 24 24"
width="18"
><path d="M12 22c5.523 0 10-4.477 10-10S17.523 2 12 2 2 6.477 2 12s4.477 10 10 10z" /><path
d="M9.09 9a3 3 0 015.83 1c0 2-3 3-3 3"
/><circle cx="12" cy="17" r=".5" />
</svg>
</div>
{#if id}
<Tooltip triggeredBy={`#${id}`}>{@html explanation}</Tooltip>
{/if}


@@ -1,6 +1,8 @@
<script lang="ts">
import Explainer from '$lib/components/Explainer.svelte';
import Explaner from './Explainer.svelte';
import Tooltip from './Tooltip.svelte';
export let id: any;
export let setting: any;
export let title: any;
export let description: any;
@@ -8,22 +10,17 @@
export let disabled = false;
export let dataTooltip: any = null;
export let loading = false;
let triggeredBy = `#${id}`;
</script>
<div class="flex items-center py-4 pr-8">
<div class="flex w-96 flex-col">
<div class="text-xs font-bold text-stone-100 md:text-base">{title}</div>
<Explainer text={description} />
<div class="text-xs font-bold text-stone-100 md:text-base">
{title}<Explaner explanation={description} />
</div>
</div>
</div>
<div
class:tooltip-right={dataTooltip}
class:tooltip-primary={dataTooltip}
class:tooltip={dataTooltip}
class:text-center={isCenter}
data-tip={dataTooltip}
class="flex justify-center"
>
<div class:text-center={isCenter} class="flex justify-center">
<div
on:click
aria-pressed="false"
@@ -32,6 +29,7 @@
class:bg-green-600={!loading && setting}
class:bg-stone-700={!loading && !setting}
class:bg-yellow-500={loading}
{id}
>
<span class="sr-only">Use setting</span>
<span
@@ -72,3 +70,7 @@
</span>
</div>
</div>
{#if dataTooltip}
<Tooltip {triggeredBy} placement="top">{dataTooltip}</Tooltip>
{/if}


@@ -0,0 +1,6 @@
<script lang="ts">
export let text: string;
export let customClass = 'max-w-[24rem]';
</script>
<div class="p-2 text-xs text-stone-400 {customClass}">{@html text}</div>


@@ -0,0 +1,8 @@
<script lang="ts">
import { Tooltip } from 'flowbite-svelte';
export let placement = 'bottom';
export let color = 'bg-coollabs text-left';
export let triggeredBy = '#tooltip-default';
</script>
<Tooltip {triggeredBy} {placement} arrow={false} {color} style="custom"><slot /></Tooltip>


@@ -4,6 +4,7 @@
import { addToast, appSession, features } from '$lib/store';
import { asyncSleep, errorNotification } from '$lib/common';
import { onMount } from 'svelte';
import Tooltip from './Tooltip.svelte';
let isUpdateAvailable = false;
let updateStatus: any = {
@@ -16,7 +17,6 @@
updateStatus.loading = true;
try {
if (dev) {
console.log(`updating to ${latestVersion}`);
await asyncSleep(4000);
return window.location.reload();
} else {
@@ -76,14 +76,14 @@
});
</script>
<div class="flex flex-col space-y-4 py-2">
<div class="py-2">
{#if $appSession.teamId === '0'}
{#if isUpdateAvailable}
<button
id="update"
disabled={updateStatus.success === false}
on:click={update}
class="icons tooltip tooltip-right tooltip-primary bg-gradient-to-r from-purple-500 via-pink-500 to-red-500 text-white duration-75 hover:scale-105"
data-tip="Update Available!"
class="icons bg-gradient-to-r from-purple-500 via-pink-500 to-red-500 text-white duration-75 hover:scale-105"
>
{#if updateStatus.loading}
<svg
@@ -184,6 +184,7 @@
>
{/if}
</button>
<Tooltip triggeredBy="#update" placement="right" color="bg-gradient-to-r from-purple-500 via-pink-500 to-red-500">New Version Available!</Tooltip>
{/if}
{/if}
</div>


@@ -1,4 +1,5 @@
<script lang="ts">
export let server: any;
let usage = {
cpu: {
load: [0, 0, 0],
@@ -22,17 +23,18 @@
usage: false,
cleanup: false
};
import { appSession } from '$lib/store';
import { addToast, appSession } from '$lib/store';
import { onDestroy, onMount } from 'svelte';
import { get } from '$lib/api';
import { get, post } from '$lib/api';
import { errorNotification } from '$lib/common';
async function getStatus() {
if (loading.usage) return;
loading.usage = true;
const data = await get('/usage');
const data = await get(`/servers/usage/${server.id}?remoteEngine=${server.remoteEngine}`);
usage = data.usage;
loading.usage = false;
}
onDestroy(() => {
clearInterval(usageInterval);
});
@@ -48,65 +50,124 @@
return errorNotification(error);
}
});
async function manuallyCleanupStorage() {
try {
loading.cleanup = true;
await post('/internal/cleanup', { serverId: server.id });
return addToast({
message: 'Cleanup done.',
type: 'success'
});
} catch (error) {
return errorNotification(error);
} finally {
loading.cleanup = false;
}
}
</script>
<div class="pb-4">
<div class="title">Hardware Details</div>
<div class="text-center p-8 ">
<div>
<div class="stat w-64">
<div class="w-full relative p-5 ">
{#if loading.usage}
<span class="indicator-item badge bg-yellow-500 badge-sm" />
{:else}
<span class="indicator-item badge bg-success badge-sm" />
{/if}
{#if server.remoteEngine}
<div
class="absolute top-0 right-0 text-xl font-bold uppercase bg-gradient-to-r from-purple-500 via-pink-500 to-red-500 p-1 rounded m-2"
>
BETA
</div>
{/if}
<div class="w-full flex flex-row space-x-4">
<div class="flex flex-col">
<h1 class="font-bold text-lg lg:text-xl truncate">
{server.name}
</h1>
<div class="text-xs ">
{#if server?.remoteIpAddress}
<h2>{server?.remoteIpAddress}</h2>
{:else}
<h2>localhost</h2>
{/if}
</div>
</div>
{#if $appSession.teamId === '0'}
<button
on:click={manuallyCleanupStorage}
class:loading={loading.cleanup}
class="btn btn-sm bg-coollabs">Cleanup Storage</button
>
{/if}
</div>
<div class="flex lg:flex-row flex-col gap-4">
<div class="flex lg:flex-row flex-col space-x-0 lg:space-x-2 space-y-2 lg:space-y-0" />
</div>
<div class="divider" />
<div class="grid grid-flow-col gap-4 grid-rows-3 justify-start lg:justify-center lg:grid-rows-1">
<div class="stats stats-vertical min-w-[16rem] mb-5 rounded bg-transparent">
<div class="stat">
<div class="stat-title">Total Memory</div>
<div class="stat-value">
{(usage?.memory.totalMemMb).toFixed(0)}<span class="text-sm">MB</span>
<div class="stat-value text-2xl">
{(usage?.memory?.totalMemMb).toFixed(0)}<span class="text-sm">MB</span>
</div>
</div>
<div class="stat w-64">
<div class="stat">
<div class="stat-title">Used Memory</div>
<div class="stat-value">
{(usage?.memory.usedMemMb).toFixed(0)}<span class="text-sm">MB</span>
<div class="stat-value text-2xl">
{(usage?.memory?.usedMemMb).toFixed(0)}<span class="text-sm">MB</span>
</div>
</div>
<div class="stat w-64">
<div class="stat">
<div class="stat-title">Free Memory</div>
<div class="stat-value">
{usage?.memory.freeMemPercentage}<span class="text-sm">%</span>
<div class="stat-value text-2xl">
{(usage?.memory?.freeMemPercentage).toFixed(0)}<span class="text-sm">%</span>
</div>
</div>
</div>
<div class="py-10">
<div class="stat w-64">
<div class="stat-title">Total CPUs</div>
<div class="stat-value">
{usage?.cpu.count}
<div class="stats stats-vertical min-w-[20rem] mb-5 bg-transparent rounded">
<div class="stat">
<div class="stat-title">Total CPU</div>
<div class="stat-value text-2xl">
{usage?.cpu?.count}
</div>
</div>
<div class="stat w-64">
<div class="stat">
<div class="stat-title">CPU Usage</div>
<div class="stat-value">
{usage?.cpu.usage}<span class="text-sm">%</span>
<div class="stat-value text-2xl">
{usage?.cpu?.usage}<span class="text-sm">%</span>
</div>
</div>
<div class="stat w-64">
<div class="stat">
<div class="stat-title">Load Average (5,10,30mins)</div>
<div class="stat-value">{usage?.cpu.load}</div>
<div class="stat-value text-2xl">{usage?.cpu?.load}</div>
</div>
</div>
<div>
<div class="stat w-64">
<div class="stats stats-vertical min-w-[16rem] mb-5 bg-transparent rounded">
<div class="stat">
<div class="stat-title">Total Disk</div>
<div class="stat-value">
{usage?.disk.totalGb}<span class="text-sm">GB</span>
<div class="stat-value text-2xl">
{usage?.disk?.totalGb}<span class="text-sm">GB</span>
</div>
</div>
<div class="stat w-64">
<div class="stat">
<div class="stat-title">Used Disk</div>
<div class="stat-value">
{usage?.disk.usedGb}<span class="text-sm">GB</span>
<div class="stat-value text-2xl">
{usage?.disk?.usedGb}<span class="text-sm">GB</span>
</div>
</div>
<div class="stat w-64">
<div class="stat">
<div class="stat-title">Free Disk</div>
<div class="stat-value">{usage?.disk.freePercentage}<span class="text-sm">%</span></div>
<div class="stat-value text-2xl">
{usage?.disk?.freePercentage}<span class="text-sm">%</span>
</div>
</div>
</div>
</div>

@@ -2,79 +2,8 @@
export let isAbsolute = true;
</script>
<svg
viewBox="0 0 128 128"
class={isAbsolute ? 'absolute top-0 left-0 -m-8 h-16 w-16' : 'mx-auto w-8 h-8'}
>
<g
><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#3A4D54"
d="M73.8 50.8h11.3v11.5h5.7c2.6 0 5.3-.5 7.8-1.3 1.2-.4 2.6-1 3.8-1.7-1.6-2.1-2.4-4.7-2.6-7.3-.3-3.5.4-8.1 2.8-10.8l1.2-1.4 1.4 1.1c3.6 2.9 6.5 6.8 7.1 11.4 4.3-1.3 9.3-1 13.1 1.2l1.5.9-.8 1.6c-3.2 6.2-9.9 8.2-16.4 7.8-9.8 24.3-31 35.8-56.8 35.8-13.3 0-25.5-5-32.5-16.8l-.1-.2-1-2.1c-2.4-5.2-3.1-10.9-2.6-16.6l.2-1.7h9.6v-11.4h11.3v-11.2h22.5v-11.3h13.5v22.5z"
/><path
fill="#00AADA"
d="M110.4 55.1c.8-5.9-3.6-10.5-6.4-12.7-3.1 3.6-3.6 13.2 1.3 17.2-2.8 2.4-8.5 4.7-14.5 4.7h-72.2c-.6 6.2.5 11.9 3 16.8l.8 1.5c.5.9 1.1 1.7 1.7 2.6 3 .2 5.7.3 8.2.2 4.9-.1 8.9-.7 12-1.7.5-.2.9.1 1.1.5.2.5-.1.9-.5 1.1-.4.1-.8.3-1.3.4-2.4.7-5 1.1-8.3 1.3h-.6000000000000001c-1.3.1-2.7.1-4.2.1-1.6 0-3.1 0-4.9-.1 6 6.8 15.4 10.8 27.2 10.8 25 0 46.2-11.1 55.5-35.9 6.7.7 13.1-1 16-6.7-4.5-2.7-10.5-1.8-13.9-.1z"
/><path
fill="#28B8EB"
d="M110.4 55.1c.8-5.9-3.6-10.5-6.4-12.7-3.1 3.6-3.6 13.2 1.3 17.2-2.8 2.4-8.5 4.7-14.5 4.7h-68c-.3 9.5 3.2 16.7 9.5 21 4.9-.1 8.9-.7 12-1.7.5-.2.9.1 1.1.5.2.5-.1.9-.5 1.1-.4.1-.8.3-1.3.4-2.4.7-5.2 1.2-8.5 1.4l-.1-.1c8.5 4.4 20.8 4.3 35-1.1 15.8-6.1 30.6-17.7 40.9-30.9-.2.1-.4.1-.5.2z"
/><path
fill="#028BB8"
d="M18.7 71.8c.4 3.3 1.4 6.4 2.9 9.3l.8 1.5c.5.9 1.1 1.7 1.7 2.6 3 .2 5.7.3 8.2.2 4.9-.1 8.9-.7 12-1.7.5-.2.9.1 1.1.5.2.5-.1.9-.5 1.1-.4.1-.8.3-1.3.4-2.4.7-5.2 1.2-8.5 1.4h-.4c-1.3.1-2.7.1-4.1.1-1.6 0-3.2 0-4.9-.1 6 6.8 15.5 10.8 27.3 10.8 21.4 0 40-8.1 50.8-26h-85.1v-.1z"
/><path
fill="#019BC6"
d="M23.5 71.8c1.3 5.8 4.3 10.4 8.8 13.5 4.9-.1 8.9-.7 12-1.7.5-.2.9.1 1.1.5.2.5-.1.9-.5 1.1-.4.1-.8.3-1.3.4-2.4.7-5.2 1.2-8.6 1.4 8.5 4.4 20.8 4.3 34.9-1.1 8.5-3.3 16.8-8.2 24.2-14.1h-70.6z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#00ACD3"
d="M28.4 52.7h9.8v9.8h-9.8v-9.8zm.8.8h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zM39.6 41.5h9.8v9.8h-9.8v-9.8zm.9.8h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#23C2EE"
d="M39.6 52.7h9.8v9.8h-9.8v-9.8zm.9.8h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#00ACD3"
d="M50.9 52.7h9.8v9.8h-9.8v-9.8zm.8.8h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#23C2EE"
d="M50.9 41.5h9.8v9.8h-9.8v-9.8zm.8.8h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zM62.2 52.7h9.8v9.8h-9.8v-9.8zm.8.8h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#00ACD3"
d="M62.2 41.5h9.8v9.8h-9.8v-9.8zm.8.8h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#23C2EE"
d="M62.2 30.2h9.8v9.8h-9.8v-9.8zm.8.8h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#00ACD3"
d="M73.5 52.7h9.8v9.8h-9.8v-9.8zm.8.8h.8v8.1h-.8v-8.1zm1.4 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1zm1.5 0h.8v8.1h-.8v-8.1z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#D4EEF1"
d="M48.8 78.3c1.5 0 2.7 1.2 2.7 2.7 0 1.5-1.2 2.7-2.7 2.7-1.5 0-2.7-1.2-2.7-2.7 0-1.5 1.2-2.7 2.7-2.7"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
fill="#3A4D54"
d="M48.8 79.1c.2 0 .5 0 .7.1-.2.1-.4.4-.4.7 0 .4.4.8.8.8.3 0 .6-.2.7-.4.1.2.1.5.1.7 0 1.1-.9 1.9-1.9 1.9-1.1 0-1.9-.9-1.9-1.9 0-1 .8-1.9 1.9-1.9M1.1 72.8h125.4c-2.7-.7-8.6-1.6-7.7-5.2-5 5.7-16.9 4-20 1.2-3.4 4.9-23 3-24.3-.8-4.2 5-17.3 5-21.5 0-1.4 3.8-21 5.7-24.3.8-3 2.8-15 4.5-20-1.2 1.1 3.5-4.9 4.5-7.6 5.2"
/><path
fill="#BFDBE0"
d="M56 97.8c-6.7-3.2-10.3-7.5-12.4-12.2-2.5.7-5.5 1.2-8.9 1.4-1.3.1-2.7.1-4.1.1-1.7 0-3.4 0-5.2-.1 6 6 13.6 10.7 27.5 10.8h3.1z"
/><path
fill="#D4EEF1"
d="M46.1 89.9c-.9-1.3-1.8-2.8-2.5-4.3-2.5.7-5.5 1.2-8.9 1.4 2.3 1.2 5.7 2.4 11.4 2.9z"
/></g
>
</svg>
<svg viewBox="0 0 128 128" class={isAbsolute ? 'absolute top-0 left-0 -m-5 h-12 w-12' : 'mx-auto w-10 h-10'}>
<path d="M124.8 52.1c-4.3-2.5-10-2.8-14.8-1.4-.6-5.2-4-9.7-8-12.9l-1.6-1.3-1.4 1.6c-2.7 3.1-3.5 8.3-3.1 12.3.3 2.9 1.2 5.9 3 8.3-1.4.8-2.9 1.9-4.3 2.4-2.8 1-5.9 2-8.9 2H79V49H66V24H51v12H26v13H13v14H1.8l-.2 1.5c-.5 6.4.3 12.6 3 18.5l1.1 2.2.1.2c7.9 13.4 21.7 19 36.8 19 29.2 0 53.3-13.1 64.3-40.6 7.4.4 15-1.8 18.6-8.9l.9-1.8-1.6-1zM28 39h10v11H28V39zm13.1 44.2c0 1.7-1.4 3.1-3.1 3.1-1.7 0-3.1-1.4-3.1-3.1 0-1.7 1.4-3.1 3.1-3.1 1.7.1 3.1 1.4 3.1 3.1zM28 52h10v11H28V52zm-13 0h11v11H15V52zm27.7 50.2c-15.8-.1-24.3-5.4-31.3-12.4 2.1.1 4.1.2 5.9.2 1.6 0 3.2 0 4.7-.1 3.9-.2 7.3-.7 10.1-1.5 2.3 5.3 6.5 10.2 14 13.8h-3.4zM51 63H40V52h11v11zm0-13H40V39h11v11zm13 13H53V52h11v11zm0-13H53V39h11v11zm0-13H53V26h11v11zm13 26H66V52h11v11zM38.8 81.2c-.2-.1-.5-.2-.8-.2-1.2 0-2.2 1-2.2 2.2 0 1.2 1 2.2 2.2 2.2s2.2-1 2.2-2.2c0-.3-.1-.6-.2-.8-.2.3-.4.5-.8.5-.5 0-.9-.4-.9-.9.1-.4.3-.7.5-.8z" fill="#019BC6"></path>
</svg>

@@ -16,4 +16,6 @@
<Icons.Redis {isAbsolute} />
{:else if type === 'couchdb'}
<Icons.CouchDB {isAbsolute} />
{:else if type === 'edgedb'}
<Icons.EdgeDB {isAbsolute} />
{/if}

@@ -0,0 +1,22 @@
<script lang="ts">
export let isAbsolute = false;
</script>
<svg
class={isAbsolute ? 'absolute top-0 left-0 -m-8 h-16 w-16' : 'mx-auto w-12 h-12'}
width="88"
fill="#1F8AED"
height="101"
viewBox="0 -15 88 101"
><path
class="pageNav_logoBar__2v4ah"
style="transform-origin:center 35.5px"
fill-rule="evenodd"
clip-rule="evenodd"
d="M55.1436 71H58.1436V0H55.1436V71Z"
/><path
fill-rule="evenodd"
clip-rule="evenodd"
d="M74.5362 35.3047C74.5362 41.3776 72.1013 42.4662 69.3799 42.4662H63.5935V28.1432H69.3799C72.1013 28.1432 74.5362 29.2318 74.5362 35.3047V35.3047ZM71.5862 35.3047C71.5862 31.0651 70.2971 30.8646 68.4352 30.8646H66.6305V39.7448H68.4352C70.2971 39.7448 71.5862 39.5443 71.5862 35.3047V35.3047ZM40.9348 42.4662V28.1432H50.0442V30.8646H43.9713V33.7865H48.5546V36.4792H43.9713V39.7448H50.0442V42.4662H40.9348ZM80.6092 36.1068V39.7448H83.13C84.7055 39.7448 85.1066 38.7135 85.1066 37.9401C85.1066 37.3385 84.8201 36.1068 82.6717 36.1068H80.6092ZM80.6092 30.8646V33.5859H82.6717C83.8462 33.5859 84.5337 33.0703 84.5337 32.2109C84.5337 31.3516 83.8462 30.8646 82.6717 30.8646H80.6092ZM77.5732 28.1432H83.4169C86.482 28.1432 87.3987 30.2917 87.3987 31.8385C87.3987 33.2708 86.482 34.3021 85.8518 34.5885C87.6851 35.4766 88.0002 37.2813 88.0002 38.1979C88.0002 39.401 87.3987 42.4662 83.4169 42.4662H77.5732V28.1432ZM23.4899 35.3047C23.4899 41.3776 21.055 42.4662 18.3337 42.4662H12.5472V28.1432H18.3337C21.055 28.1432 23.4899 29.2318 23.4899 35.3047V35.3047ZM32.4272 39.8594C33.974 39.8594 34.7761 39.3438 35.0626 39V37.4245H32.599V34.9609H37.4975V40.6615C37.0678 41.3203 34.7188 42.6094 32.5704 42.6094C29.047 42.6094 26.0678 41.2344 26.0678 35.1615C26.0678 29.0885 29.0756 28 31.797 28C36.0652 28 37.1251 30.2344 37.4688 32.2109L34.948 32.7839C34.8048 31.8672 34.0027 30.7214 32.1694 30.7214C30.3074 30.7214 29.0183 30.9219 29.0183 35.1615C29.0183 39.401 30.3647 39.8594 32.4272 39.8594V39.8594ZM20.539 35.3047C20.539 31.0651 19.2499 30.8646 17.3879 30.8646H15.5833V39.7448H17.3879C19.2499 39.7448 20.539 39.5443 20.539 35.3047V35.3047ZM0 42.4662V28.1432H9.10938V30.8646H3.03646V33.7865H7.61979V36.4792H3.03646V39.7448H9.10938V42.4662H0Z"
/></svg
>

@@ -6,5 +6,6 @@ export { default as MongoDB } from './MongoDB.svelte';
export { default as MySQL } from './MySQL.svelte';
export { default as PostgreSQL } from './PostgreSQL.svelte';
export { default as Redis } from './Redis.svelte';
export { default as EdgeDB } from './EdgeDB.svelte';

@@ -4,7 +4,7 @@
<svg
xmlns="http://www.w3.org/2000/svg"
class={isAbsolute ? 'w-14 h-14 absolute top-0 left-0 -m-5' : 'w-9 mx-auto'}
class={isAbsolute ? 'w-14 h-14 absolute top-0 left-0 -m-5' : 'w-9 h-9 mx-auto'}
viewBox="0 0 384 384"
fill="none"
>

@@ -5,7 +5,7 @@
<svg
viewBox="0 0 700 240"
xmlns="http://www.w3.org/2000/svg"
class={isAbsolute ? 'w-36 absolute top-0 left-0 -m-3 -mt-5' : 'w-28 mx-auto'}
class={isAbsolute ? 'w-36 absolute top-0 left-0 -m-3 -mt-5' : 'w-28 h-28 mx-auto'}
><path fill="#FDBC3D" d="m90.694 107.498-.981.39-20.608 8.23 6.332 6.547z" /><path
fill="#8EC63F"
d="M61.139 77.914 46.632 93 56.9 103.547c8.649-7.169 17.832-10.502 18.653-10.789L61.139 77.914z"

@@ -4,6 +4,6 @@
<img
alt="ghost logo"
class={isAbsolute ? 'w-12 absolute top-0 left-0 -m-3 -mt-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-12 h-12 absolute top-0 left-0 -m-3 -mt-5' : 'w-8 h-8 mx-auto'}
src="/ghost.png"
/>

@@ -3,7 +3,7 @@
</script>
<svg
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
xmlns="http://www.w3.org/2000/svg"
xmlns:xlink="http://www.w3.org/1999/xlink"
style="isolation:isolate"

@@ -3,7 +3,7 @@
</script>
<svg
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
viewBox="0 0 81 84"
fill="none"
xmlns="http://www.w3.org/2000/svg"

@@ -4,7 +4,7 @@
<svg
xmlns="http://www.w3.org/2000/svg"
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
fill="none"
viewBox="0 0 140 140"
data-lt-extension-installed="true"

@@ -4,7 +4,7 @@
<svg
viewBox="0 0 127 74"
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
xmlns="http://www.w3.org/2000/svg"
><path
d="M.825 73.993l23.244-59.47A21.85 21.85 0 0144.42.625h14.014L35.19 60.096a21.85 21.85 0 01-20.352 13.897H.825z"

@@ -4,6 +4,6 @@
<img
alt="minio logo"
class={isAbsolute ? 'w-7 absolute top-0 left-0 -m-3 -mt-5' : 'w-4 mx-auto'}
class={isAbsolute ? 'w-7 h-12 absolute top-0 left-0 -m-3 -mt-5' : 'w-4 h-8 mx-auto'}
src="/minio.png"
/>

@@ -3,6 +3,6 @@
</script>
<img
alt="moodle logo"
class={isAbsolute ? 'w-9 absolute top-0 left-0 -m-3 -mt-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-9 h-9 absolute top-0 left-0 -m-3 -mt-5' : 'w-8 h-8 mx-auto'}
src="/moodle.png"
/>

@@ -3,7 +3,7 @@
</script>
<svg
class={isAbsolute ? 'w-12 h-12 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-12 h-12 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
viewBox="0 0 220 105"
>
<g>

@@ -4,6 +4,6 @@
<img
alt="nocodb logo"
class={isAbsolute ? 'w-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
src="/nocodb.png"
/>

@@ -4,6 +4,6 @@
<img
alt="plausible logo"
class={isAbsolute ? 'w-9 absolute top-0 left-0 -m-4' : 'w-6 mx-auto'}
class={isAbsolute ? 'w-9 h-12 absolute top-0 left-0 -m-4' : 'w-6 h-8 mx-auto'}
src="/plausible.png"
/>

@@ -0,0 +1,56 @@
<script lang="ts">
export let isAbsolute = false;
</script>
<svg
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
id="svg8"
version="1.1"
viewBox="0 0 92 92"
class={isAbsolute ? 'w-12 h-12 absolute top-0 left-0 -m-3 -mt-5' : 'w-8 h-8 mx-auto'}
>
<defs id="defs2" />
<metadata id="metadata5">
<rdf:RDF>
<cc:Work rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g transform="translate(-40.921303,-17.416526)" id="layer1">
<circle
r="0"
style="fill:none;stroke:#000000;stroke-width:12;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
cy="92"
cx="75"
id="path3713"
/>
<circle
r="30"
cy="53.902557"
cx="75.921303"
id="path834"
style="fill:none;fill-opacity:1;stroke:#3050ff;stroke-width:10;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
/>
<path
d="m 67.514849,37.91524 a 18,18 0 0 1 21.051475,3.312407 18,18 0 0 1 3.137312,21.078282"
id="path852"
style="fill:none;fill-opacity:1;stroke:#3050ff;stroke-width:5;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
/>
<rect
transform="rotate(-46.234709)"
ry="1.8669105e-13"
y="122.08995"
x="3.7063529"
height="39.963303"
width="18.846331"
id="rect912"
style="opacity:1;fill:#3050ff;fill-opacity:1;stroke:none;stroke-width:8;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
/>
</g>
</svg>

@@ -38,4 +38,8 @@
<Icons.Moodle {isAbsolute} />
{:else if type === 'glitchTip'}
<Icons.GlitchTip {isAbsolute} />
{:else if type === 'searxng'}
<Icons.Searxng {isAbsolute} />
{:else if type === 'weblate'}
<Icons.Weblate {isAbsolute} />
{/if}

@@ -7,7 +7,7 @@
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 856.000000 856.000000"
preserveAspectRatio="xMidYMid meet"
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
>
<metadata> Created by potrace 1.11, written by Peter Selinger 2001-2013 </metadata>
<g

@@ -3,7 +3,7 @@
</script>
<svg
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
viewBox="0 0 128 128"
>
<path

@@ -3,7 +3,7 @@
</script>
<svg
class={isAbsolute ? 'w-10 absolute top-0 left-0 -m-5' : 'w-8 mx-auto'}
class={isAbsolute ? 'w-10 h-10 absolute top-0 left-0 -m-5' : 'w-8 h-8 mx-auto'}
xmlns="http://www.w3.org/2000/svg"
xmlns:xlink="http://www.w3.org/1999/xlink"
version="1.1"

@@ -0,0 +1,61 @@
<script lang="ts">
export let isAbsolute = false;
</script>
<svg
xmlns="http://www.w3.org/2000/svg"
class={isAbsolute ? 'w-16 h-16 absolute top-0 left-0 -m-7' : 'w-12 h-12 mx-auto'}
version="1.1"
viewBox="0 0 300 300"
><linearGradient
id="a"
x1=".3965"
x2="98.808"
y1="55.253"
y2="55.253"
gradientTransform="scale(.98308 1.0172)"
gradientUnits="userSpaceOnUse"
><stop stop-color="#00d2e6" offset="0" /><stop
stop-color="#2eccaa"
offset="1"
/></linearGradient
><linearGradient
id="b"
x1="49.017"
x2="99.793"
y1="137.89"
y2="113.96"
gradientTransform="scale(1.1631 .8598)"
gradientUnits="userSpaceOnUse"
><stop stop-opacity="0" offset="0" /><stop offset=".51413" /><stop
stop-opacity="0"
offset="1"
/></linearGradient
><linearGradient
id="c"
x1="201.82"
x2="103.58"
y1="57.649"
y2="57.649"
gradientTransform="scale(.98308 1.0172)"
gradientUnits="userSpaceOnUse"
><stop stop-color="#1fa385" offset="0" /><stop
stop-color="#2eccaa"
offset="1"
/></linearGradient
><g transform="translate(50,76)" fill-rule="evenodd"
><path
d="m127.25 111.61c-2.8884-0.0145-5.7666-0.6024-8.4797-1.7847-6.1117-2.6626-11.493-7.6912-15.872-14.495 1.2486-2.2193 2.3738-4.5173 3.3784-6.8535 4.4051-10.243 6.5-21.46 6.6607-32.593-0.0233-0.22082-0.0416-0.44244-0.0552-0.66483l-0.0121-0.57132c-0.01-4.3654-0.67459-8.7898-2.1767-12.909-1.7304-4.7458-4.4887-9.4955-8.865-11.348-0.79519-0.33595-1.6316-0.47701-2.4642-0.45737-5.5049-10.289-5.6799-20.149 0-29.537 0.10115 0 0.20619 3.9293e-4 0.30734 0.001179 6.7012 0.07387 13.34 2.1418 19.021 5.7536 15.469 9.835 23.182 29.001 23.352 47.818 2e-3 0.22083-3.9e-4 0.44126-7e-3 0.66169h0.0868c-0.0226 19.887-4.8049 40.054-14.875 56.979zm-34.3 31.216c-14.448 5.9425-31.228 5.6236-45.549-1.025-16.476-7.6476-29.065-22.512-36.818-39.479-13.262-29.022-13.566-63.715-0.98815-93.182 9.4458 3.7788 17.845-2.2397 17.845-2.2397s-0.01945 9.2605 8.9478 13.905c-9.2007 21.556-8.979 47.167 0.2412 68.173 4.4389 10.107 11.22 19.519 20.619 24.842 3.3547 1.8996 7.041 3.126 10.833 3.5862 0.01404 0.0219 0.02808 0.0439 0.04214 0.0658 6.6965 10.449 15.132 19.157 24.828 25.354z"
fill="url(#a)"
fill-rule="nonzero"
/><path
d="m127.24 111.61c-2.8869-0.0151-5.7636-0.60296-8.4755-1.7846-6.1127-2.663-11.495-7.6928-15.874-14.498 1.2494-2.2205 2.3754-4.5198 3.3806-6.8572 1.3282-3.0884 2.4463-6.2648 3.3644-9.501 2.128-7.4978 30.382 2.0181 26.072 14.371-2.2239 6.373-5.0394 12.509-8.4675 18.27zm-34.302 31.212c-14.446 5.9396-31.224 5.6198-45.543-1.0278-16.476-7.6476 0.44739-33.303 9.8465-27.981 3.3533 1.8988 7.0378 3.125 10.828 3.5856 0.01567 0.0245 0.03135 0.049 0.04704 0.0735 6.695 10.447 15.128 19.153 24.821 25.349z"
fill="url(#b)"
opacity=".3"
/><path
d="m56.762 54.628c-0.0066-0.22043-0.0093-0.44086-7e-3 -0.66169 0.17001-18.817 7.8827-37.983 23.352-47.818 5.6811-3.6118 12.32-5.6798 19.021-5.7536 0.10115-7.8585e-4 0.20619-0.001179 0.30734-0.001179v29.537c-0.83254-0.01965-1.669 0.12141-2.4642 0.45737-4.3763 1.8523-7.1345 6.602-8.865 11.348-1.5021 4.1191-2.1669 8.5434-2.1767 12.909l-0.01206 0.57132c-0.01362 0.2224-0.0319 0.44401-0.05524 0.66483 0.16067 11.134 2.2556 22.35 6.6607 32.593 4.9334 11.472 12.775 22.025 23.847 26.849 8.3526 3.6397 17.612 2.7811 25.182-1.5057 9.3991-5.3226 16.18-14.734 20.619-24.842 9.2202-21.006 9.4419-46.617 0.24121-68.173 8.9673-4.6444 8.9478-13.905 8.9478-13.905s8.3993 6.0185 17.845 2.2397c12.578 29.466 12.274 64.16-0.98815 93.182-7.7535 16.967-20.343 31.831-36.818 39.479-14.667 6.809-31.913 6.9792-46.591 0.58389-13.19-5.7489-23.918-16.106-31.637-28.15-11.179-17.443-16.472-38.678-16.496-59.604z"
fill="url(#c)"
fill-rule="nonzero"
/></g
></svg
>

Some files were not shown because too many files have changed in this diff.