Compare commits

...

433 Commits

Author SHA1 Message Date
Andras Bacsai
3b9b3f8ffa Merge pull request #708 from coollabsio/next
Fixes for v3.11.0
2022-11-07 15:21:38 +01:00
Andras Bacsai
fda8823050 fix: plausible analytics things 2022-11-07 14:59:39 +01:00
Andras Bacsai
b5756cb14f save templates 2022-11-07 14:10:37 +01:00
Andras Bacsai
617d3dbe52 fix: templates 2022-11-07 13:48:57 +01:00
Andras Bacsai
c18beb1c7c fix: templates 2022-11-07 13:40:18 +01:00
Andras Bacsai
a6957b919c fix: template 2022-11-07 13:38:55 +01:00
Andras Bacsai
816a362534 fix: confirm restart service 2022-11-07 13:35:45 +01:00
Andras Bacsai
7ce3ebde4e fix: templates 2022-11-07 13:30:58 +01:00
Andras Bacsai
cc2f83c4d9 Merge pull request #705 from coollabsio/next
v3.11.0
2022-11-07 12:02:32 +01:00
Andras Bacsai
6ce492049e fix 2022-11-07 11:52:34 +01:00
Andras Bacsai
a7999de4b0 fix: compose icon 2022-11-07 11:27:17 +01:00
Andras Bacsai
d4bdfabf19 chore: version++ 2022-11-07 11:03:16 +01:00
Andras Bacsai
85030ab804 fix: template files 2022-11-07 10:44:50 +01:00
Andras Bacsai
2a9bd00a50 fixes 2022-11-07 09:36:51 +01:00
Andras Bacsai
1c2d76e651 UI updates 2022-11-07 08:59:06 +01:00
Andras Bacsai
a97f7d225a fix: remove old minio proxies 2022-11-04 21:34:24 +01:00
Andras Bacsai
2781848aac cleanup 2022-11-04 21:15:08 +01:00
Andras Bacsai
d179da2bee fix 2022-11-04 21:11:38 +01:00
Andras Bacsai
60e7922734 stop minio proxy on restart 2022-11-04 21:07:19 +01:00
Andras Bacsai
1ece37ec3c update template 2022-11-04 15:20:31 +01:00
Andras Bacsai
8dad865146 contribution guide 2022-11-04 14:52:21 +01:00
Andras Bacsai
80d15e782b fixes 2022-11-04 14:41:52 +01:00
Andras Bacsai
d24e4c6518 fix icons 2022-11-04 14:24:22 +01:00
Andras Bacsai
6def46544c fix: wh catchall for all 2022-11-04 12:11:04 +01:00
Andras Bacsai
d66bae32d3 fix: preview wbh 2022-11-04 12:08:26 +01:00
Andras Bacsai
1b753a4020 fix: Pr stopps main deployment 2022-11-04 12:08:20 +01:00
Andras Bacsai
25e6a74a0a fix: wb for previews 2022-11-04 11:45:47 +01:00
Andras Bacsai
afde00a4be fix: websecure redirect 2022-11-04 11:44:04 +01:00
Andras Bacsai
0022d380bb fix: webhooks 2022-11-04 11:26:43 +01:00
Andras Bacsai
d80d2ab934 fix: previews wbh 2022-11-04 10:52:08 +01:00
Andras Bacsai
13c1734753 feat: redirect catch-all rule 2022-11-04 10:50:16 +01:00
Andras Bacsai
bf33d6c34e fix: remote webhooks 2022-11-04 09:58:37 +01:00
Andras Bacsai
1b02f9bd5d fix: webhook simplified 2022-11-04 09:54:13 +01:00
Andras Bacsai
9f63c645ff fix: load public repos 2022-11-04 09:53:57 +01:00
Andras Bacsai
abf271fb68 fixes 2022-11-03 15:54:39 +01:00
Andras Bacsai
363755c3bf fix doclinks 2022-11-03 15:37:39 +01:00
Andras Bacsai
ba2db666aa updates 2022-11-03 15:17:31 +01:00
Andras Bacsai
9f3677b694 updates 2022-11-03 14:59:37 +01:00
Andras Bacsai
e6024c997f debug more 2022-11-03 14:41:58 +01:00
Andras Bacsai
3cb83e2286 debug 2022-11-03 14:28:53 +01:00
Andras Bacsai
780d03e5e1 fixes 2022-11-03 14:16:51 +01:00
Andras Bacsai
214114e6ce fixes 2022-11-03 14:09:06 +01:00
Andras Bacsai
274d3fe679 fix wh again 2022-11-03 13:50:04 +01:00
Andras Bacsai
0ecf86d8a3 remove debug 2022-11-03 13:47:49 +01:00
Andras Bacsai
c2d4390a72 fix wh 2022-11-03 13:46:51 +01:00
Andras Bacsai
7627d59d43 debug 2022-11-03 13:37:48 +01:00
Andras Bacsai
71ce9a6b37 fix: pathprefix 2022-11-03 13:28:58 +01:00
Andras Bacsai
232018c925 fix 2022-11-03 11:40:55 +01:00
Andras Bacsai
9dfbbe58ff fix: toast, rde, webhooks 2022-11-03 11:32:18 +01:00
Andras Bacsai
fa9738a2e0 fix: tooltip 2022-11-03 10:13:36 +01:00
Andras Bacsai
94ecbc5921 fix: app logs view 2022-11-03 09:43:41 +01:00
Andras Bacsai
3c68d317d7 fix: traefik proxy q 10s 2022-11-03 09:43:34 +01:00
Andras Bacsai
56d4edfb9d fix: toast 2022-11-03 09:43:23 +01:00
Andras Bacsai
c6c037ff17 fixes 2022-11-03 09:31:01 +01:00
Andras Bacsai
44feba4d89 fix branches 2022-11-03 08:14:57 +01:00
Andras Bacsai
962f2c7380 Merge pull request #682 from Huskehhh/main
fix: expose port is readonly on the wrong condition
2022-11-02 22:49:43 +01:00
Andras Bacsai
22007426aa fix migration 2022-11-02 22:44:46 +01:00
Andras Bacsai
008e9a92d3 fixes 2022-11-02 22:32:09 +01:00
Andras Bacsai
41139ee2ab fixes 2022-11-02 22:11:32 +01:00
Andras Bacsai
845c40d23c fixes 2022-11-02 21:31:16 +01:00
Andras Bacsai
a22f26c4c8 fixes 2022-11-02 16:03:27 +01:00
Andras Bacsai
99ff020f56 fix 2022-11-02 15:37:51 +01:00
Andras Bacsai
f863b42b71 fix: heroku bp 2022-11-02 15:36:23 +01:00
Andras Bacsai
2e713b459e fixes 2022-11-02 15:19:20 +01:00
Andras Bacsai
923241ce1e fixes 2022-11-02 10:08:22 +01:00
Andras Bacsai
3a8929b9d7 fix 2022-11-02 09:58:14 +01:00
Andras Bacsai
eb92d39d40 replace ws with socketio 2022-11-02 09:49:21 +01:00
Andras Bacsai
bdc62a007e fixes 2022-10-28 20:30:34 +00:00
Andras Bacsai
4b35db6291 wss 2022-10-28 14:51:22 +00:00
Andras Bacsai
c8282b215d use window location for ws in prod 2022-10-28 14:29:21 +00:00
Andras Bacsai
c123669828 test ws 2022-10-28 15:50:57 +02:00
Andras Bacsai
781fd0a1cd fixes 2022-10-28 12:03:07 +02:00
Andras Bacsai
9bd99605fb stop wp ftp on service stop 2022-10-28 12:02:22 +02:00
Andras Bacsai
dc626bd4f0 fixes 2022-10-28 11:54:03 +02:00
Andras Bacsai
aa27aeafa1 fix fqdn check 2022-10-28 09:15:03 +02:00
Andras Bacsai
cdb25cd0e9 remove console.log 2022-10-27 20:06:21 +00:00
Andras Bacsai
dc2d15fd9c fix 2022-10-27 20:05:02 +00:00
Andras Bacsai
55cb788380 fixes 2022-10-27 19:21:33 +00:00
Andras Bacsai
0f3b7fe643 fixes 2022-10-27 18:55:21 +00:00
Andras Bacsai
4b812350a8 fix migrations 2022-10-27 13:37:45 +02:00
Andras Bacsai
aec37164de debug 2022-10-27 13:18:24 +02:00
Andras Bacsai
dec02bd8db update pnpm lock 2022-10-27 12:06:22 +02:00
Andras Bacsai
1bd6a8ed9e update dockerfile 2022-10-27 12:06:12 +02:00
Andras Bacsai
2030f714fa cleanup stuffs 2022-10-27 09:55:32 +02:00
Andras Bacsai
4416646954 package updates + tags selector 2022-10-26 15:50:10 +02:00
Andras Bacsai
52ba9dc02a Update devTemplates.yaml 2022-10-26 15:44:29 +02:00
Andras Bacsai
dad3d42d14 fixes 2022-10-26 14:12:27 +02:00
Andras Bacsai
0d12f3043b fix + cleanup 2022-10-26 13:44:32 +02:00
Andras Bacsai
1225786fc0 cleanup + fixes 2022-10-26 11:21:55 +02:00
Andras Bacsai
71496d5229 cleanup + arm support 2022-10-26 10:49:30 +02:00
Andras Bacsai
eb0aa20fe1 fixes 2022-10-26 10:27:33 +02:00
Andras Bacsai
c34de3d0a3 fixes 2022-10-26 10:12:17 +02:00
Andras Bacsai
54e0a9fc28 fix 2022-10-26 09:35:56 +02:00
Andras Bacsai
4bcd034b3d fixes 2022-10-26 09:27:43 +02:00
Andras Bacsai
111bd29cc8 updates 2022-10-25 22:43:31 +02:00
Andras Bacsai
b0fcd23ca6 template update 2022-10-25 15:59:04 +02:00
Andras Bacsai
f80b1d31f5 fixes, dev templates, etc 2022-10-25 15:12:40 +02:00
Andras Bacsai
811ea5b92a fixes 2022-10-24 22:54:19 +02:00
Andras Bacsai
f9dfbd5800 fix 2022-10-24 21:54:33 +02:00
Andras Bacsai
88f1c36929 fixes and remote tempaltes 2022-10-24 14:23:34 +02:00
Andras Bacsai
8bbe771f5b fake service in dev 2022-10-24 13:12:31 +02:00
Andras Bacsai
c578fa63e5 updates 2022-10-24 13:10:05 +02:00
Andras Bacsai
17badf95dc fix 2022-10-21 22:34:27 +02:00
Andras Bacsai
a267ee40d2 asd 2022-10-21 22:15:50 +02:00
Andras Bacsai
8ef645b3c2 fix 2022-10-21 22:13:29 +02:00
Andras Bacsai
35625b22f5 hm 2022-10-21 22:05:05 +02:00
Andras Bacsai
221dcefd6c fix 2022-10-21 21:24:52 +02:00
Andras Bacsai
9c74a9c1db fixes 2022-10-21 21:19:30 +02:00
Andras Bacsai
55fc3920fc updates 2022-10-21 20:54:33 +02:00
Andras Bacsai
5d60b5eb8b saving things 2022-10-21 15:51:32 +02:00
Andras Bacsai
049d5166e8 add template.json 2022-10-21 11:30:16 +02:00
Andras Bacsai
f4019db3d1 fixes 2022-10-20 16:06:33 +02:00
Andras Bacsai
9f3732d35b fix: service logs 2022-10-20 10:42:47 +02:00
Andras Bacsai
b4f17ac3c6 add weblate 2022-10-20 10:03:23 +02:00
Andras Bacsai
978e35d335 add searxng 2022-10-20 09:44:08 +02:00
Andras Bacsai
22cbbec960 add searxng 2022-10-20 09:18:13 +02:00
Andras Bacsai
21f3a70788 add glitchtip 2022-10-19 14:59:38 +02:00
Andras Bacsai
b4c6f80e1c add hasura 2022-10-19 14:15:48 +02:00
Andras Bacsai
e1198c42eb add hasura 2022-10-19 13:14:39 +02:00
Andras Bacsai
e09fdbcef0 add umami + hashedpws 2022-10-19 12:00:43 +02:00
Andras Bacsai
b708e79929 add umami 2022-10-19 11:26:27 +02:00
Andras Bacsai
cbaecff3b7 meilisearch 2022-10-19 10:55:16 +02:00
Andras Bacsai
4f7d2630af meilisearch 2022-10-19 10:29:08 +02:00
Andras Bacsai
92d3860240 ghost 2022-10-19 10:24:53 +02:00
Andras Bacsai
3757d5da9f ghost 2022-10-19 10:07:04 +02:00
Andras Bacsai
1d38a885bb add wordpress 2022-10-18 15:17:59 +02:00
Andras Bacsai
dbd767e8f1 wordpress 2022-10-18 15:01:18 +02:00
Andras Bacsai
8b83c38127 vscode 2022-10-18 14:45:30 +02:00
Andras Bacsai
f1ea01e709 vscodeserver + minio 2022-10-18 14:34:10 +02:00
Andras Bacsai
12a1aeb0f8 add minio 2022-10-18 14:12:33 +02:00
Andras Bacsai
413150012f fix 2022-10-18 13:57:48 +02:00
Andras Bacsai
8ef5604ce8 updates 2022-10-18 13:52:47 +02:00
Andras Bacsai
42e50c800b fix 2022-10-18 12:51:53 +02:00
Andras Bacsai
8fbd08003c fixes 2022-10-18 12:43:35 +02:00
Andras Bacsai
877577efdb plausible migration done 2022-10-18 12:02:09 +02:00
Andras Bacsai
a6f457749b lots of changes 2022-10-18 11:32:38 +02:00
Andras Bacsai
9afb713df1 add length option to template 2022-10-17 14:55:28 +00:00
Andras Bacsai
8f660c0276 work-work 2022-10-17 15:43:57 +02:00
Andras Bacsai
a7e86d9afd fix 2022-10-14 15:54:19 +02:00
Andras Bacsai
462eea90c0 tons of updates 2022-10-14 15:48:37 +02:00
Andras Bacsai
79c30dfc91 remove nix file 2022-10-14 09:45:45 +02:00
Jordyn
410a78b366 fix: expose port is readonly on the wrong condition 2022-10-14 10:49:53 +11:00
Andras Bacsai
065807a0bc fix: secret errors 2022-10-13 15:43:57 +02:00
Andras Bacsai
1d93658e56 Merge pull request #680 from coollabsio/next
v3.10.16
2022-10-12 22:05:39 +02:00
Andras Bacsai
2b7865e6ea update packages 2022-10-12 19:57:32 +00:00
Andras Bacsai
0cdba8c329 fix: single container logs and usage with compose 2022-10-12 19:53:21 +00:00
Andras Bacsai
11b317b788 ui: new resource label 2022-10-12 15:10:00 +02:00
Andras Bacsai
fb955e15f4 Merge pull request #656 from coollabsio/next
v3.10.15
2022-10-12 14:51:56 +02:00
Andras Bacsai
ae2d141f0d fix: pull does not work remotely on huge compose file 2022-10-12 14:27:13 +02:00
Andras Bacsai
68c983923e fix: dockerfile 2022-10-12 14:08:00 +02:00
Andras Bacsai
bf252f7f20 fix: appwrite v1 missing containers 2022-10-12 13:50:28 +02:00
Andras Bacsai
324038486f fixes 2022-10-12 12:02:47 +02:00
Andras Bacsai
bef5da49cf remove text 2022-10-12 11:43:47 +02:00
Andras Bacsai
24e7f547fa feat: monitoring by container 2022-10-12 11:42:45 +02:00
Andras Bacsai
3ee3ab0ad1 fix: port required if fqdn is set 2022-10-12 11:27:13 +02:00
Andras Bacsai
f734154da8 fix: check compose domains in general 2022-10-12 11:17:02 +02:00
Andras Bacsai
7a053ce697 fix: gitlab auth and compose reload 2022-10-12 11:11:18 +02:00
Andras Bacsai
25f250310e fix: dev container 2022-10-12 10:18:28 +02:00
Andras Bacsai
4eca05bbba fix: gh release 2022-10-12 10:07:18 +02:00
Andras Bacsai
45b0f791bb ci: update staging release 2022-10-11 10:54:22 +02:00
Andras Bacsai
42415a81c1 fix: update docker binaries 2022-10-11 10:54:13 +02:00
Andras Bacsai
6882e83d1e fix: logs for not running containers 2022-10-10 15:28:46 +02:00
Andras Bacsai
d4b7318413 fix: smart search for new services 2022-10-10 15:28:36 +02:00
Andras Bacsai
a2b4d400af fix: add git sha to build args 2022-10-10 15:28:16 +02:00
Andras Bacsai
f07868d24e fix: do not show nope as ip address for dbs 2022-10-10 15:28:02 +02:00
Andras Bacsai
d46ee049f4 Merge pull request #668 from bstst/bugfix/fix-deno-run-options
Fix deno options string
2022-10-10 15:26:00 +02:00
Andras Bacsai
c62eda5627 Merge pull request #666 from cmer/cmer-inject-git-sha
Inject GIT SHA into build
2022-10-10 15:17:18 +02:00
Martin Saulis
7683164ed2 fix deno options string 2022-10-07 14:59:16 +03:00
Carl Mercier
7090c16575 Update common.ts 2022-10-06 15:42:57 -04:00
Carl Mercier
9e634fed13 Inject GIT SHA into build process
As discussed at https://github.com/docker/hub-feedback/issues/600
2022-10-06 15:38:15 -04:00
Andras Bacsai
9bb125cebd feat: docker compose 2022-10-06 15:51:08 +02:00
Andras Bacsai
0c4850b91d fixes 2022-10-06 14:24:28 +02:00
Andras Bacsai
0eb7688c4d fixes 2022-10-06 14:15:05 +02:00
Andras Bacsai
f47cdb68d9 fixes 2022-10-06 12:01:24 +02:00
Andras Bacsai
d3c3cded37 debug: one less worker thread 2022-10-06 11:44:36 +02:00
Andras Bacsai
e91ea4ecbe feat: docker compose 2022-10-06 11:37:47 +02:00
Andras Bacsai
680b20d199 update dev container flow 2022-10-06 11:37:42 +02:00
Andras Bacsai
ec97e04fd4 revert last debug 2022-10-06 11:37:28 +02:00
Andras Bacsai
b0b2657fe0 debug: remove worker jobs 2022-10-06 10:26:40 +02:00
Andras Bacsai
d27426fd8f feat: docker compose support 2022-10-06 10:25:41 +02:00
Andras Bacsai
d8206c0e3e wip: docker compose 2022-10-05 15:34:52 +02:00
Andras Bacsai
3f1841a188 init: docker-compose support 2022-10-05 10:27:12 +00:00
Andras Bacsai
cb478e0dc8 add contribution guide on container based development flow 2022-10-05 09:13:51 +00:00
Andras Bacsai
02c42a7e3a fix: pure docker based development 2022-10-05 09:01:17 +00:00
Andras Bacsai
ef40f7349e Merge pull request #653 from coollabsio/next
v3.10.14
2022-10-05 09:24:39 +02:00
Andras Bacsai
86eebb35cb fix: do not use npx 2022-10-05 08:58:14 +02:00
Andras Bacsai
a901388887 revert 2022-10-04 23:06:46 +02:00
Andras Bacsai
6cd1c5de38 Dockerfile update 2022-10-04 22:22:29 +02:00
Andras Bacsai
7489f172a1 test prestart 2022-10-04 22:14:16 +02:00
Andras Bacsai
702798c275 revert things 2022-10-04 22:00:50 +02:00
Andras Bacsai
430d51866c test: remove prisma 2022-10-04 21:46:23 +02:00
Andras Bacsai
9d08421f01 dev intervals 2022-10-04 21:46:01 +02:00
Andras Bacsai
f4051874b2 Merge pull request #654 from vvvctr/patch-1
Proper capitalization for WordPress service type.
2022-10-04 21:14:52 +02:00
vvvctr
bbe0690056 Proper capitalization for WordPress service type. 2022-10-04 16:27:56 +02:00
Andras Bacsai
772c0d1e41 fix: nope if you are not logged in 2022-10-04 15:14:52 +02:00
Andras Bacsai
8eb9ca0260 fix: add buildkit features 2022-10-04 15:06:09 +02:00
Andras Bacsai
bd27afe0da fix: verify and configure remote docker engines 2022-10-04 15:01:47 +02:00
Andras Bacsai
a3af21275a fix: meilisearch data dir 2022-10-04 14:05:11 +02:00
Andras Bacsai
61eb155d13 chore: version++ 2022-10-04 14:05:01 +02:00
Andras Bacsai
7932c1c4a9 Merge pull request #648 from coollabsio/next
v3.10.13
2022-10-03 12:56:59 +02:00
Andras Bacsai
f776fb83e7 ui: settings icon 2022-10-03 11:45:24 +02:00
Andras Bacsai
a97521aba2 webhook: send 200 for ping and installation wh 2022-10-03 11:42:07 +02:00
Andras Bacsai
d1c0fe503e fix: remove unnecessary things 2022-10-03 11:32:15 +02:00
Andras Bacsai
ed02c1ae36 ui: iam & settings update 2022-10-03 11:31:50 +02:00
Andras Bacsai
9a67cf7355 fix: fork pr previews 2022-10-03 09:48:47 +02:00
Andras Bacsai
755eeda364 remove inspector 2022-10-03 09:25:31 +02:00
Andras Bacsai
136dee7747 ui: fix indicato 2022-10-03 09:20:57 +02:00
Andras Bacsai
e4e8428855 minify api 2022-10-02 11:08:04 +00:00
Andras Bacsai
de8dc021f9 fix: pr branches 2022-10-02 09:37:08 +00:00
Andras Bacsai
991587f252 fix: typo 2022-10-02 09:24:43 +00:00
Andras Bacsai
8dbcf257c4 fix: handle forked repositories 2022-10-02 09:16:51 +00:00
Andras Bacsai
0b067364a9 fix: default 0 pending invitations 2022-10-02 08:55:36 +00:00
Andras Bacsai
5367bd6134 show webhook details 2022-10-02 08:48:56 +00:00
Andras Bacsai
92228c4379 schema migration 2022-10-02 08:43:45 +00:00
Andras Bacsai
fb2c7896b3 update packages 2022-10-02 08:43:36 +00:00
Andras Bacsai
23265d9091 revert last changes 2022-10-02 10:38:08 +02:00
Andras Bacsai
2c9bb0e767 disable stuff 2022-10-01 13:58:50 +00:00
Andras Bacsai
f9e8400d83 temporary disable schedulers 2022-10-01 13:46:52 +00:00
Andras Bacsai
927a13cd76 temporary enable inspector 2022-10-01 13:03:55 +00:00
Andras Bacsai
51b3293e69 ui: inprogress version of iam 2022-09-29 15:46:52 +02:00
Andras Bacsai
3f76cadea9 fix: cleanup stucked tcp proxies 2022-09-29 14:44:20 +02:00
Andras Bacsai
6dbf53b558 chore: version++ 2022-09-29 14:32:55 +02:00
Andras Bacsai
22e937c798 fix: do not start tcp proxy without main container 2022-09-29 14:32:35 +02:00
Andras Bacsai
ac5cc8b299 Merge pull request #643 from coollabsio/next
v3.10.12
2022-09-29 14:09:37 +02:00
Andras Bacsai
c588ab723b fix: show logs better 2022-09-29 13:57:52 +02:00
Andras Bacsai
4b2dfc051d typo 2022-09-29 13:47:15 +02:00
Andras Bacsai
5238c83f3f fix: initial deploy status 2022-09-29 13:23:38 +02:00
Andras Bacsai
90bb580e50 ui: fixes 2022-09-29 13:23:29 +02:00
Andras Bacsai
f40e142704 ui: fix 2022-09-29 13:15:19 +02:00
Andras Bacsai
a67618675d fix: default buildImage and baseBuildImage 2022-09-29 13:15:16 +02:00
Andras Bacsai
4fe436e4d1 fix: dashboard statuses 2022-09-29 13:02:10 +02:00
Andras Bacsai
683b8c966f feat: cleanup unconfigured services and databases 2022-09-28 15:41:20 +02:00
Andras Bacsai
28377a156d feat: cleanup unconfigured applications 2022-09-28 11:45:02 +02:00
Andras Bacsai
3dcc4faabb Merge pull request #642 from coollabsio/next
v3.10.11
2022-09-28 11:18:01 +02:00
Andras Bacsai
60a033f93a ui: fix 2022-09-28 11:16:35 +02:00
Andras Bacsai
436bd73786 fix: baseDirectory 2022-09-28 11:14:23 +02:00
Andras Bacsai
5c69ff3339 fix: do not get status of more than 10 resources defined by category 2022-09-28 10:59:58 +02:00
Andras Bacsai
2105b1e7c4 ux: hasura console notification 2022-09-28 10:55:08 +02:00
Andras Bacsai
523004e5b2 chore: version++ 2022-09-28 10:54:57 +02:00
Andras Bacsai
5e02c386ec Merge pull request #641 from coollabsio/next
v3.10.10
2022-09-28 10:49:19 +02:00
Andras Bacsai
b4501fe52d ui: beta flag 2022-09-28 10:41:32 +02:00
Andras Bacsai
3c29eaa1b1 ui: small fix 2022-09-28 10:35:47 +02:00
Andras Bacsai
ee67e163b1 feat: system-wide github apps 2022-09-28 10:34:27 +02:00
Andras Bacsai
9662bc29fb ui: fix gitlab importer view 2022-09-28 09:56:27 +02:00
Andras Bacsai
96f2660b98 ui: loading button 2022-09-28 09:47:05 +02:00
Andras Bacsai
20f594c66c chore: version++ 2022-09-28 09:30:57 +02:00
Andras Bacsai
2b8d59dca3 Merge pull request #637 from coollabsio/next
v3.10.9
2022-09-28 09:29:31 +02:00
Andras Bacsai
d44047d109 ui: dev logs 2022-09-28 09:19:51 +02:00
Andras Bacsai
57c4d33bd3 ui: main resource search 2022-09-28 09:07:59 +02:00
Andras Bacsai
7a5377efe0 ui: resource button fix 2022-09-28 09:07:46 +02:00
Andras Bacsai
91e7cffccc fix: only log things to console in dev mode 2022-09-28 08:39:59 +02:00
Andras Bacsai
df31e47313 fix: disable development low disk space 2022-09-28 08:39:33 +02:00
Andras Bacsai
cb9586270c fix: able to delete apps in unconfigured state 2022-09-27 09:27:28 +00:00
Andras Bacsai
21dfa5227c fix: logs in docker bp 2022-09-26 20:37:58 +00:00
Andras Bacsai
9d15d2be77 chore: version++ 2022-09-26 20:22:08 +00:00
Andras Bacsai
929c02d31f ui: fix basedirectory meaning 2022-09-26 20:21:41 +00:00
Andras Bacsai
846185dd42 Merge pull request #635 from coollabsio/next
v3.10.8
2022-09-26 21:36:54 +02:00
Andras Bacsai
7bc2299a8e rename grafana dashboard to grafana 2022-09-26 19:24:13 +00:00
Andras Bacsai
d40e131bd8 fix: appwrite function network is not the default 2022-09-26 19:20:41 +00:00
Andras Bacsai
552c7297bf ui: service fixes 2022-09-26 19:00:36 +00:00
Andras Bacsai
3f5fd23955 ui: fix button 2022-09-26 18:57:24 +00:00
Andras Bacsai
8b8566251e chore: version++ 2022-09-26 18:49:05 +00:00
Andras Bacsai
6db47def8e fix: service logs 2022-09-26 18:48:22 +00:00
Andras Bacsai
1d0edc7b25 Merge pull request #634 from coollabsio/next
v3.10.7
2022-09-26 15:43:27 +02:00
Andras Bacsai
f9a417638a fix: seed 2022-09-26 13:42:19 +00:00
Andras Bacsai
984fe01551 Merge pull request #632 from coollabsio/next
v3.10.6
2022-09-26 13:43:49 +02:00
Andras Bacsai
d0cb350687 fix: error notification 2022-09-26 13:37:06 +02:00
Andras Bacsai
5f51011ce1 fix: empty preview value 2022-09-26 13:36:54 +02:00
Andras Bacsai
9ca125ac55 fix: error notification 2022-09-26 13:36:47 +02:00
Andras Bacsai
360f4f8c27 fix: seed new preview secret types 2022-09-26 13:36:15 +02:00
Andras Bacsai
6501f71bd6 chore: version++ 2022-09-26 13:24:02 +02:00
Andras Bacsai
bf6b799dba Merge pull request #625 from coollabsio/next
v3.10.5
2022-09-26 11:23:01 +02:00
Andras Bacsai
5f57279283 fix: multiplex ssh and ssl copy 2022-09-26 11:15:14 +02:00
Andras Bacsai
5ed3565520 ui: Beta features 2022-09-26 10:31:52 +02:00
Andras Bacsai
513fa90b8a ui: fixes 2022-09-26 10:27:51 +02:00
Andras Bacsai
a4d9b9689b fix: laravel php chooser 2022-09-26 10:27:42 +02:00
Andras Bacsai
1c05c0dcbb ui: fix 2022-09-26 10:21:01 +02:00
Andras Bacsai
a1b49a3a6b fix: base directory & docker bp 2022-09-26 10:20:53 +02:00
Andras Bacsai
6f57298cbb fix: scp without host verification & cert copy 2022-09-26 09:52:04 +02:00
Andras Bacsai
d8ce673088 fix: debug log for bp 2022-09-25 08:00:53 +00:00
Andras Bacsai
4cd7af7a74 fix: stream logs for heroku bp 2022-09-25 07:58:52 +00:00
Andras Bacsai
49c61b5992 fix: allow basedirectory for heroku 2022-09-25 07:48:03 +00:00
Andras Bacsai
e44d0550d2 fix: basedirectory should be empty if null 2022-09-25 07:47:54 +00:00
Andras Bacsai
17f82109b6 fix: consider base directory in heroku bp 2022-09-25 07:47:37 +00:00
Andras Bacsai
2d8888ae9b ui: fixes 2022-09-23 20:01:30 +00:00
Andras Bacsai
4abe9c6fb2 feat: ssl certificate sets custom ssl for applications 2022-09-23 15:21:19 +02:00
Andras Bacsai
f9d94fa660 ui: fixes 2022-09-23 14:20:37 +02:00
Andras Bacsai
eaa13f4990 remove self-signed certs 2022-09-23 14:10:17 +02:00
Andras Bacsai
01fd5901fe ui: fixes
fix: secret saving process
2022-09-23 14:09:26 +02:00
Andras Bacsai
3d6adeffc4 ui: more UI improvements 2022-09-22 15:48:16 +02:00
Andras Bacsai
9066952759 ui: redesign applications & settings
fix: follow logs
2022-09-22 12:30:28 +02:00
Andras Bacsai
6dd7f6274a fix: error during saving logs 2022-09-22 12:30:04 +02:00
Andras Bacsai
7a8fe6d152 ui: settings view 2022-09-22 09:47:33 +02:00
Andras Bacsai
be507be3a9 feat: refresh resource status on dashboard 2022-09-22 09:47:25 +02:00
Andras Bacsai
657b97f190 ui: fix destination view 2022-09-22 09:09:31 +02:00
Andras Bacsai
9d7745cd9b fix: settings db requests 2022-09-22 09:04:44 +02:00
Andras Bacsai
3668f83693 fix: not found redirect 2022-09-22 09:04:32 +02:00
Andras Bacsai
a2d5d99c1f fix: able to search with id 2022-09-22 09:03:34 +02:00
Andras Bacsai
f379ef6a3b feat: ssl cert on traefik config 2022-09-22 09:03:19 +02:00
Andras Bacsai
510a748749 fix: multiplex ssh connections 2022-09-22 09:02:53 +02:00
Andras Bacsai
550150d685 fix: db migration 2022-09-22 09:02:39 +02:00
Andras Bacsai
011ea9659e fix: ssl certificate distribution 2022-09-22 09:02:27 +02:00
Andras Bacsai
6eca7d948e add migration 2022-09-21 15:48:49 +02:00
Andras Bacsai
90e639f119 feat: custom certificate 2022-09-21 15:48:32 +02:00
Andras Bacsai
86ac6461d1 fix: dropdown 2022-09-20 13:33:35 +00:00
Andras Bacsai
18a95bf9ab ui: improvements 2022-09-20 15:06:33 +02:00
Andras Bacsai
7949bbe66d Merge branch 'temp' into next 2022-09-20 14:58:25 +02:00
Andras Bacsai
4b603c452a Merge pull request #602 from c0ldfront/trilium-notes-service
add trilium-notes-service
2022-09-20 14:54:53 +02:00
Andras Bacsai
837f0634b6 Merge pull request #606 from c0ldfront/grafana-service
add grafana-dashboard-service
2022-09-20 14:51:06 +02:00
Andras Bacsai
78076f7854 Merge branch 'next' into grafana-service 2022-09-20 14:50:37 +02:00
Andras Bacsai
719350cee1 Merge pull request #614 from c0ldfront/minio-login-issue
fix: MinIO invalid login
2022-09-20 14:49:41 +02:00
Andras Bacsai
4f6be3e6f5 revert: show usage everytime 2022-09-20 14:37:29 +02:00
Andras Bacsai
8e61e9fecb feat: Add migration button to appwrite 2022-09-20 14:36:06 +02:00
Andras Bacsai
2083285d78 version correction 2022-09-20 14:34:54 +02:00
Andras Bacsai
034e86e2cb ui: small logs on mobile 2022-09-20 13:07:49 +02:00
Andras Bacsai
f4a2d5c652 ui: dropdown as infobox 2022-09-19 14:01:48 +00:00
Andras Bacsai
534ccd6bf6 ui: fix git icon 2022-09-19 13:52:41 +00:00
Andras Bacsai
c17064f853 ui: fixes 2022-09-19 14:05:25 +02:00
Andras Bacsai
1e1566082f fix: tooltip 2022-09-19 12:14:14 +02:00
Andras Bacsai
449548654d Merge branch 'ui' into next 2022-09-19 12:06:00 +02:00
Andras Bacsai
6fc99524f0 ui: responsive! 2022-09-19 12:05:47 +02:00
Andras Bacsai
051629fad3 fix: ui 2022-09-19 08:57:55 +02:00
Andras Bacsai
f957008c1c Merge branch 'main' into ui 2022-09-19 08:57:48 +02:00
Andras Bacsai
98e1deec88 Merge pull request #612 from kaname-png/some-tweaks
feat(routes): ui for mobile and fixes
2022-09-19 08:54:23 +02:00
Andras Bacsai
99127652af fix: undead endpoint does not require JWT 2022-09-19 08:53:36 +02:00
Andras Bacsai
e9b9e9e82c fix: Appwrite default version 1.0 2022-09-19 08:53:22 +02:00
Andras Bacsai
2ed5c3746e chore: version++ 2022-09-19 08:53:11 +02:00
Andras Bacsai
8902056fdb Merge pull request #622 from coollabsio/next
v3.10.4
2022-09-19 08:31:54 +02:00
Andras Bacsai
defa6ff6e8 fix: update PR building status 2022-09-16 15:49:54 +02:00
Andras Bacsai
eed44e81be fix: await #2 2022-09-16 15:20:45 +02:00
Andras Bacsai
1951aec5ec fix: await 2022-09-16 15:20:17 +02:00
Andras Bacsai
9c4e0b4107 fix: get building status 2022-09-16 15:20:02 +02:00
Andras Bacsai
c8deac660d fix: appwrite?! 2022-09-16 14:49:01 +02:00
Andras Bacsai
4cc5ec9bd0 remove line 2022-09-16 14:38:33 +02:00
Andras Bacsai
c41bef2e81 Traefik add / pathprefix 2022-09-16 14:30:32 +02:00
Andras Bacsai
5b735cf960 fix: versions of appwrite 2022-09-16 14:30:25 +02:00
Andras Bacsai
604e960aa9 ui: fix buttons 2022-09-16 14:07:43 +02:00
Andras Bacsai
6c465aa1f2 feat: show remote servers 2022-09-16 14:07:35 +02:00
Andras Bacsai
c266832fdc ui: fix cleanup button 2022-09-16 14:07:28 +02:00
Andras Bacsai
906d8d0413 update contribution guide 2022-09-16 09:26:48 +02:00
Andras Bacsai
cb05fd4a3c fix: build logs 2022-09-16 09:06:45 +02:00
Kaname
2eda24799b Merge branch 'ui' into some-tweaks 2022-09-15 10:52:31 -06:00
Andras Bacsai
41e221f0cb fix: load more 2022-09-15 14:45:57 +02:00
Andras Bacsai
f75af035bb fix: logging 2022-09-15 14:27:55 +02:00
Andras Bacsai
e9e6449edf fix: canceling build 2022-09-15 14:05:33 +02:00
Andras Bacsai
f09d76da35 fix: changing build log without reload
fix: elapsed time of a build
2022-09-15 13:57:20 +02:00
Andras Bacsai
40dfe0919b hm 2022-09-15 11:15:07 +02:00
Andras Bacsai
85990dd074 fix: fluentbit and logs 2022-09-15 10:56:19 +02:00
Andras Bacsai
38acc16e1c fix: coolify update 2022-09-15 10:10:38 +02:00
Andras Bacsai
b7cc4c1e92 fix: fluentbit configuration 2022-09-15 10:08:17 +02:00
Andras Bacsai
1f232d96d8 differentiate between db logs and not 2022-09-15 10:03:38 +02:00
Andras Bacsai
83508f165d fix compose 2022-09-15 09:54:40 +02:00
Andras Bacsai
7cc58e7e84 Coolify fluent-bit image 2022-09-15 09:47:54 +02:00
Andras Bacsai
31d9740aac fix: fallback to db logs 2022-09-15 09:35:50 +02:00
Andras Bacsai
69891a64a0 feat: fluentbit 2022-09-15 09:34:39 +02:00
Andras Bacsai
0940309600 init for pgdb 2022-09-14 13:39:14 +02:00
Andras Bacsai
a762b1ed60 fix: updateMany build logs 2022-09-14 13:19:55 +02:00
Andras Bacsai
1b9d9d3a8b fix: dev url 2022-09-14 10:50:12 +02:00
Andras Bacsai
d9908b3d61 feat: previewApplications finalized 2022-09-13 15:50:20 +02:00
Andras Bacsai
c40b80436a fix: login 2022-09-13 10:14:31 +02:00
Andras Bacsai
8f1e352bcc ui: fix plausible 2022-09-13 10:14:25 +02:00
Andras Bacsai
18e769b5e5 fix: plausible analytics actions 2022-09-13 10:14:15 +02:00
Andras Bacsai
27af6459b3 feat: previewapplications init 2022-09-13 07:57:57 +00:00
Kaname
2c4bfab01a chore: whoops 2022-09-11 17:56:12 -06:00
Kaname
e689be552b Merge branch 'next' into some-tweaks 2022-09-11 17:54:04 -06:00
Andras Bacsai
ad80e7f48b Merge pull request #619 from coollabsio/next
v3.10.3
2022-09-11 17:01:22 +02:00
Andras Bacsai
d81b75b084 fix: umami init sql 2022-09-11 14:50:23 +00:00
Andras Bacsai
90f1431047 Merge pull request #617 from coollabsio/next
v3.10.2
2022-09-11 14:45:30 +02:00
Andras Bacsai
61ea7dabae fix: show error logs 2022-09-11 12:38:58 +00:00
Andras Bacsai
5d9f5f4a7d fix: gitlab importer for public repos 2022-09-11 12:38:50 +00:00
Andras Bacsai
f956f612d3 fixes 2022-09-11 12:20:01 +00:00
Andras Bacsai
3f5108268d Merge pull request #618 from coollabsio/draft/nostalgic-allen
Draft/nostalgic allen
2022-09-11 14:16:54 +02:00
Andras Bacsai
4c0dfc3f30 Merge branch 'next' into draft/nostalgic-allen 2022-09-11 14:16:26 +02:00
Andras Bacsai
1670fe9b1c Update supportedVersions.ts 2022-09-11 12:14:27 +00:00
Andras Bacsai
300b28c0f2 move reset queue button to build logs 2022-09-11 12:11:48 +00:00
Andras Bacsai
e7038961ef Merge branch 'next' into some-tweaks 2022-09-11 13:56:07 +02:00
Andras Bacsai
24e77a5211 Merge pull request #615 from ArticaDev/main
fix: changing umami image URL to get latest version
2022-09-11 13:54:11 +02:00
Andras Bacsai
9df039fbc2 Merge pull request #616 from coollabsio/draft/nostalgic-allen
feat: Add queue reset button
2022-09-11 13:53:15 +02:00
Andras Bacsai
143cd46a81 feat: Add queue reset button 2022-09-11 11:51:43 +00:00
LL
680e9871ed fix: changing umami image URL to get latest version 2022-09-10 23:32:36 -03:00
David Mydlarz
d5ece58f71 MINIO_SERVER_URL -> apiFqdn 2022-09-11 09:07:31 +09:00
Kaname
d7bbb5c4b7 chore: minor changes 2022-09-10 19:14:41 +00:00
Kaname
cf9c991c79 chore: minor changes 2022-09-10 19:03:09 +00:00
Kaname
0f0d96195d fix(routes): ui from secrets table 2022-09-10 19:00:43 +00:00
Kaname
3a562bb714 feat(routes): improve ui for apps, databases and services logs 2022-09-10 17:45:47 +00:00
Kaname
6381ba8478 fix(routes): header of settings page in databases 2022-09-10 17:27:48 +00:00
Kaname
9e3c14841a fix: ui with headers 2022-09-10 17:23:55 +00:00
Kaname
1917091338 Merge remote-tracking branch 'origin' into some-tweaks 2022-09-10 17:15:18 +00:00
Kaname
b1bb508554 Merge branch 'next' into some-tweaks 2022-09-10 17:14:44 +00:00
Andras Bacsai
0a68a48fc5 Merge pull request #611 from coollabsio/next
v3.10.1
2022-09-10 10:47:47 +02:00
Andras Bacsai
d3af6792d0 remove debug 2022-09-10 08:46:31 +00:00
Andras Bacsai
44dc3b743e add debug 2022-09-10 08:35:31 +00:00
Andras Bacsai
b469d2832d fix: delete resource use window location 2022-09-10 07:58:56 +00:00
Andras Bacsai
d844026c29 dev: arm should be on next all the time 2022-09-10 07:41:36 +00:00
Andras Bacsai
21b4990652 ui: fix follow button 2022-09-10 07:29:51 +00:00
Andras Bacsai
39e24bdc97 remove console logs 2022-09-10 07:17:20 +00:00
Andras Bacsai
bc66b98176 fix: build secrets for apps 2022-09-10 07:14:30 +00:00
Andras Bacsai
d6d3fb46cc fix: volumes for services 2022-09-10 07:14:17 +00:00
Kaname
4040b334f5 fix(routes): more ui tweaks 2022-09-10 01:54:59 +00:00
Kaname
d7e72519ef fix(routes): more ui tweaks 2022-09-10 01:30:07 +00:00
Kaname
c7752f0be9 fix(routes): more ui tweaks 2022-09-10 01:27:48 +00:00
Kaname
0ffe28a733 fix(routes): more ui tweaks 2022-09-10 01:23:17 +00:00
Kaname
56f24fe317 feat(styles): make header css component 2022-09-10 01:13:44 +00:00
Kaname
341cde2781 Merge branch 'next' into some-tweaks 2022-09-10 01:07:11 +00:00
Kaname
33bb8d434d feat(ui): improve header of pages 2022-09-10 00:16:49 +00:00
Kaname
9f813b7385 fix: github conflicts 2022-09-10 00:05:19 +00:00
Kaname
02a336a25d Merge remote-tracking branch 'origin' into some-tweaks 2022-09-09 23:57:03 +00:00
Andras Bacsai
88ed1446f4 fix: secrets for PR 2022-09-09 15:46:47 +02:00
Andras Bacsai
c69312f128 fix: remove unnecessary gitlab group name 2022-09-09 15:19:14 +02:00
Andras Bacsai
c5bcff0e10 fix: show restarting application & logs 2022-09-09 15:14:58 +02:00
Andras Bacsai
871d1e2440 ui: fix button 2022-09-09 15:14:19 +02:00
Andras Bacsai
1619afb938 chore: version++ 2022-09-09 14:49:13 +02:00
Andras Bacsai
25528913f1 fix: show restarting apps 2022-09-09 14:48:58 +02:00
David Mydlarz
7df532fa72 Grafana Dashboard service completed 2022-09-09 00:17:47 +09:00
David Mydlarz
1f40c2ccf8 add trilium-notes-service 2022-09-08 17:32:11 +09:00
Kaname
4a8fd309c5 fix(routes): searchbar ui 2022-09-07 17:59:22 +00:00
Kaname
b416849d9c feat: re-apply ui improves 2022-09-07 17:14:29 +00:00
Kaname
bc321d8ced Merge branch 'next' into some-tweaks 2022-09-07 17:14:09 +00:00
Kaname
45919fc0cf fix(routes): duplicates classes in services page 2022-09-07 02:09:12 +00:00
Kaname
dd6f4c4844 fix(routes): ui from settings page 2022-09-07 02:03:48 +00:00
Kaname
bb47db033f fix(routes): more ui tweaks 2022-09-07 01:47:43 +00:00
Kaname
111ea78693 fix(routes): more ui tweaks 2022-09-07 01:47:04 +00:00
Kaname
c17253589a fix(routes): more ui tweaks 2022-09-07 01:45:05 +00:00
Kaname
7e6156f5dd fix(routes): more ui tweaks 2022-09-07 01:29:48 +00:00
Kaname
d5cfb63f52 fix(routes): ui from services page 2022-09-07 00:50:34 +00:00
Kaname
cab15055e7 fix(routes): ui from databases page 2022-09-07 00:22:56 +00:00
Kaname
9185910171 fix(routes): ui from databases page 2022-09-07 00:20:15 +00:00
Kaname
b4892e0caf fix(routes): ui from databases page 2022-09-07 00:16:57 +00:00
Kaname
83e0cafef9 chore: minor changes 2022-09-06 23:46:11 +00:00
Kaname
7cb75506c3 fix(routes): ui from destinations page 2022-09-06 23:43:11 +00:00
Kaname
ac6970ad40 fix(routes): improve design of git sources page 2022-09-06 19:06:27 +00:00
Kaname
5a95cc236c fix(routes): improve design of application page 2022-09-06 18:51:19 +00:00
Kaname
95c942f477 feat(layout): added drawer when user is in mobile 2022-09-06 17:37:26 +00:00
276 changed files with 16224 additions and 13475 deletions


@@ -5,7 +5,7 @@ on:
types: [released]
jobs:
arm64-build:
arm64:
runs-on: [self-hosted, arm64]
steps:
- name: Checkout
@@ -31,7 +31,7 @@ jobs:
tags: coollabsio/coolify:${{steps.package-version.outputs.current-version}}-arm64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-arm64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-arm64,mode=max
amd64-build:
amd64:
runs-on: ubuntu-latest
steps:
- name: Checkout
@@ -57,9 +57,35 @@ jobs:
tags: coollabsio/coolify:${{steps.package-version.outputs.current-version}}-amd64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-amd64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-amd64,mode=max
aarch64:
runs-on: [self-hosted, arm64]
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Get current package version
uses: martinbeentjes/npm-get-version-action@v1.2.3
id: package-version
- name: Build and push
uses: docker/build-push-action@v2
with:
context: .
platforms: linux/aarch64
push: true
tags: coollabsio/coolify:${{steps.package-version.outputs.current-version}}-aarch64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-aarch64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-aarch64,mode=max
merge-manifest:
runs-on: ubuntu-latest
needs: [amd64-build, arm64-build]
needs: [amd64, arm64, aarch64]
steps:
- name: Checkout
uses: actions/checkout@v3
@@ -77,7 +103,7 @@ jobs:
id: package-version
- name: Create & publish manifest
run: |
docker manifest create coollabsio/coolify:${{steps.package-version.outputs.current-version}} --amend coollabsio/coolify:${{steps.package-version.outputs.current-version}}-amd64 --amend coollabsio/coolify:${{steps.package-version.outputs.current-version}}-arm64
docker manifest create coollabsio/coolify:${{steps.package-version.outputs.current-version}} --amend coollabsio/coolify:${{steps.package-version.outputs.current-version}}-amd64 --amend coollabsio/coolify:${{steps.package-version.outputs.current-version}}-arm64 --amend coollabsio/coolify:${{steps.package-version.outputs.current-version}}-aarch64
docker manifest push coollabsio/coolify:${{steps.package-version.outputs.current-version}}
- uses: sarisia/actions-status-discord@v1
if: always()


@@ -6,7 +6,7 @@ on:
- next
jobs:
arm64-making-something-cool:
arm64:
runs-on: [self-hosted, arm64]
steps:
- name: Checkout
@@ -34,7 +34,7 @@ jobs:
tags: coollabsio/coolify:next-arm64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-next-arm64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-next-arm64,mode=max
amd64-making-something-cool:
amd64:
runs-on: ubuntu-latest
steps:
- name: Checkout
@@ -59,12 +59,12 @@ jobs:
context: .
platforms: linux/amd64
push: true
tags: coollabsio/coolify:next-amd64,coollabsio/coolify:next-test
tags: coollabsio/coolify:next-amd64
cache-from: type=registry,ref=coollabsio/coolify:buildcache-next-amd64
cache-to: type=registry,ref=coollabsio/coolify:buildcache-next-amd64,mode=max
merge-manifest-to-be-cool:
merge-manifest:
runs-on: ubuntu-latest
needs: [arm64-making-something-cool, amd64-making-something-cool]
needs: [arm64, amd64]
steps:
- name: Checkout
uses: actions/checkout@v3

.gitignore (vendored): 4 changed lines

@@ -12,4 +12,6 @@ client
apps/api/db/*.db
local-serve
apps/api/db/migration.db-journal
apps/api/core*
apps/api/core*
logs
others/certificates


@@ -1,256 +0,0 @@
# 👋 Welcome
First of all, thank you for considering contributing to my project! It means a lot 💜.
## 🙋 Want to help?
If you are new to contributing on GitHub, you can start with the [first contributions](https://github.com/firstcontributions/first-contributions) guide.
Follow the [introduction](#introduction) to get set up, then start contributing!
Here is a short list of what you can do to help the project:
- [🧑‍💻 Develop your own ideas](#developer-contribution)
- [🌐 Translate the project](#translation)
## 👋 Introduction
### Setup with Github codespaces
If you have GitHub Codespaces enabled, you can just create a codespace and run `pnpm dev` to start the dev environment. All the required dependencies and packages have already been configured for you.
### Setup with Gitpod
If you have a [Gitpod](https://gitpod.io) account, you can just create a workspace from this repository, run `pnpm install && pnpm db:push && pnpm db:seed` and then `pnpm dev`. All the required dependencies and packages have already been configured for you.
### Setup locally in your machine
> 🔴 At the moment, Coolify **doesn't support Windows**. You must use Linux or MacOS. Consider using Gitpod or Github Codespaces.
#### Recommended Pull Request Guideline
- Fork the project
- Clone your fork repo to local
- Create a new branch
- Push to your fork repo
- Create a pull request: https://github.com/coollabsio/coolify/compare
- Write a proper description
- Open the pull request to review against `next` branch
---
# 🧑‍💻 Developer contribution
## Technical skills required
- **Languages**: Node.js / Javascript / Typescript
- **Framework JS/TS**: [SvelteKit](https://kit.svelte.dev/) & [Fastify](https://www.fastify.io/)
- **Database ORM**: [Prisma.io](https://www.prisma.io/)
- **Docker Engine API**
---
## How to start after you set up your local fork?
### Prerequisites
1. Due to the lock file, this repository works best with [pnpm](https://pnpm.io). I recommend you use `pnpm` because it is cool and efficient!
2. You need to have [Docker Engine](https://docs.docker.com/engine/install/) installed locally.
3. You need to have [Docker Compose Plugin](https://docs.docker.com/compose/install/compose-plugin/) installed locally.
4. You need to have [GIT LFS Support](https://git-lfs.github.com/) installed locally.
Optional:
5. To test Heroku buildpacks, you need the [pack](https://github.com/buildpacks/pack) binary installed locally.
### Steps for local setup
1. Copy `apps/api/.env.template` to `apps/api/.env` and set the `COOLIFY_APP_ID` environment variable to something cool.
2. Install dependencies with `pnpm install`.
3. Create a local SQLite database with `pnpm db:push`.
This will apply all migrations to `db/dev.db`.
4. Seed the database with base entities with `pnpm db:seed`.
5. You can start coding after starting `pnpm dev`.
---
## Database migrations
During development, if you change the database layout, you need to run `pnpm db:push` to migrate the database and create types for Prisma. You also need to restart the development process.
If the schema is finalized, you need to create a migration file with `pnpm db:migrate <nameOfMigration>`, where `nameOfMigration` is a name you choose. Make it meaningful. :)
---
## How to add new services
You can add any open-source and self-hostable software (service/application) to Coolify if the following statements are true:
- Self-hostable (obviously)
- Open-source
- Maintained (I do not want to add software full of bugs)
## Backend
There are 5 steps to take on the backend side.
1. Create Prisma / database schema for the new service.
2. Add supported versions of the service.
3. Update global functions.
4. Create API endpoints.
5. Define automatically generated variables.
> I will use [Umami](https://umami.is/) as an example service.
### Create Prisma / Database schema for the new service.
You only need to do this if you store passwords or any persistent configuration. It is required by most services, but there are some exceptions, like NocoDB.
Update the Prisma schema in [prisma/schema.prisma](prisma/schema.prisma).
- Add a new model with the new service name.
- Make a relationship with the `Service` model.
- In the `Service` model, the name of the new field should start with a lowercase letter.
- If the service needs a database, define a `publicPort` field so its database can be made public; an example field name in the case of PostgreSQL is `postgresqlPublicPort`. It should be an optional field.
If you are finished with the Prisma schema, update the database schema with the `pnpm db:push` command.
> You must restart the running development environment to be able to use the new model.
> If you use VS Code, you probably need to restart the `Typescript Language Server` to get the new types loaded in the running environment.
### Add supported versions
Supported versions are hardcoded into Coolify (for now).
You need to update `supportedServiceTypesAndVersions` function at [apps/api/src/lib/services/supportedVersions.ts](apps/api/src/lib/services/supportedVersions.ts). Example JSON:
```js
{
// Name used to identify the service internally
name: 'umami',
// Fancier name to show to the user
fancyName: 'Umami',
// Docker base image for the service
baseImage: 'ghcr.io/mikecao/umami',
// Optional: If there is any dependent image, you should list it here
images: [],
// Usable tags
versions: ['postgresql-latest'],
// Which tag is the recommended
recommendedVersion: 'postgresql-latest',
// Application's default port, Umami listens on 3000
ports: {
main: 3000
}
}
```
### Add required functions/properties
1. Add the new service to the `includeServices` variable in [apps/api/src/lib/services/common.ts](apps/api/src/lib/services/common.ts), so it will be included in all places in the database queries where it is required.
```js
const include: any = {
destinationDocker: true,
persistentStorage: true,
serviceSecret: true,
minio: true,
plausibleAnalytics: true,
vscodeserver: true,
wordpress: true,
ghost: true,
meiliSearch: true,
umami: true // This line!
};
```
2. Update the database update query with the new service type in the `configureServiceType` function in [apps/api/src/lib/services/common.ts](apps/api/src/lib/services/common.ts). This function defines the automatically generated variables (passwords, users, etc.) and their encryption process (if applicable).
```js
[...]
else if (type === 'umami') {
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword());
const postgresqlDatabase = 'umami';
const hashSalt = encrypt(generatePassword(64));
await prisma.service.update({
where: { id },
data: {
type,
umami: {
create: {
postgresqlDatabase,
postgresqlPassword,
postgresqlUser,
hashSalt,
}
}
}
});
}
```
3. Add field details to [apps/api/src/lib/services/serviceFields.ts](apps/api/src/lib/services/serviceFields.ts), so every component knows what to do with each value (decrypt it, show it by default, or treat it as read-only).
```js
export const umami = [{
name: 'postgresqlUser',
isEditable: false,
isLowerCase: false,
isNumber: false,
isBoolean: false,
isEncrypted: false
}]
```
4. Add the service deletion query to the `removeService` function in [apps/api/src/lib/services/common.ts](apps/api/src/lib/services/common.ts) (a sketch of this step follows the list below).
5. Add the start process for the new service in [apps/api/src/lib/services/handlers.ts](apps/api/src/lib/services/handlers.ts)
> See the `startUmamiService()` function as an example.
6. Add the newly added start process to `startService` in [apps/api/src/routes/api/v1/services/handlers.ts](apps/api/src/routes/api/v1/services/handlers.ts)
7. You need to add a custom logo at [apps/ui/src/lib/components/svg/services](apps/ui/src/lib/components/svg/services) as a Svelte component and export it in [apps/ui/src/lib/components/svg/services/index.ts](apps/ui/src/lib/components/svg/services/index.ts).
SVG is recommended, but you can use PNG as well. It should have the `isAbsolute` variable with suitable CSS classes, primarily for sizing and positioning.
8. You need to include the logo at:
- [apps/ui/src/lib/components/svg/services/ServiceIcons.svelte](apps/ui/src/lib/components/svg/services/ServiceIcons.svelte) with `isAbsolute`.
- [apps/ui/src/routes/services/[id]/_ServiceLinks.svelte](apps/ui/src/routes/services/[id]/_ServiceLinks.svelte) with a link to the docs/main site of the service.
9. By default, the URL and name frontend forms are included in [apps/ui/src/routes/services/[id]/_Services/_Services.svelte](apps/ui/src/routes/services/[id]/_Services/_Services.svelte).
If you need to show more details on the frontend, such as users/passwords, you need to add a Svelte component to [apps/ui/src/routes/services/[id]/_Services](apps/ui/src/routes/services/[id]/_Services) whose filename starts with an underscore.
> For example, see the Umami component [here](apps/ui/src/routes/services/[id]/_Services/_Umami.svelte).
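For step 4, here is a minimal sketch of what the deletion query could look like for the Umami example. The function shape, helper names and surrounding cleanup are assumptions for illustration, not the actual `removeService` implementation:
```js
// Illustrative sketch only (assumed names, not the real implementation).
// `prisma` is the Prisma client instance already used throughout common.ts.
export async function removeService({ id }) {
  // Remove the service-specific row first; it references Service via `serviceId`,
  // matching the model created for the Umami example earlier.
  await prisma.umami.deleteMany({ where: { serviceId: id } });
  // Then remove the Service row itself. The real function also cleans up
  // related records (secrets, persistent storage, the other service types)
  // following the same pattern.
  await prisma.service.delete({ where: { id } });
}
```
Steps 5 and 6 follow the same example: the start handler in `handlers.ts` builds and starts the containers for the new service, and `startService` dispatches to that handler for the new service type.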
Good job! 👏
<!-- # 🌐 Translate the project
The project uses [sveltekit-i18n](https://github.com/sveltekit-i18n/lib) for translations.
It follows the [ISO 639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) to name languages.
### Installation
You must have gone through all the [intro](#introduction) steps before you can start translating.
This is only a suggestion, but I recommend you use:
- Visual Studio Code
- [i18n Ally for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=Lokalise.i18n-ally): ideal to see the progress of the translation.
- [Svelte for VS Code](https://marketplace.visualstudio.com/items?itemName=svelte.svelte-vscode): to get the syntax color for the project
### Adding a language
If your language doesn't appear in the [locales folder list](src/lib/locales/), follow the step below:
1. In `src/lib/locales/`, Copy paste `en.json` and rename it with your language (eg: `cz.json`).
2. In the [lang.json](src/lib/lang.json) file, add a line after the first bracket (`{`) with `"ISO of your language": "Language",` (eg: `"cz": "Czech",`).
3. Have fun translating! -->

CONTRIBUTION.md (new file, 57 lines)

@@ -0,0 +1,57 @@
# Contribution
First, thanks for considering contributing to my project. It really means a lot! :)
You can ask for guidance anytime on our Discord server in the #contribution channel.
## Setup your development environment
### Container based development flow (recommended and the easiest)
All you need is to install [Docker Engine 20.11+](https://docs.docker.com/engine/install/) on your local machine and run `pnpm dev:container`. It will build the base image for Coolify and start the development server inside Docker. All required ports (3000, 3001) will be exposed to your host.
### Github codespaces
If you have GitHub Codespaces enabled, you can just create a codespace and run `pnpm dev` to start the dev environment. All the required dependencies and packages have already been configured for you.
### Gitpod
1. Use the [container based development flow](#container-based-development-flow-easiest)
2. Or set up your workspace manually:
Create a workspace from this repository, run `pnpm install && pnpm db:push && pnpm db:seed` and then `pnpm dev`. All the required dependencies and packages have already been configured for you.
> Some packages, such as `pack`, are not installed this way, so you cannot test all the features. Please use the [container based development flow](#container-based-development-flow-easiest) instead.
### Local Machine
> At the moment, Coolify `doesn't support Windows`. You must use `Linux` or `MacOS` or consider using Gitpod or Github Codespaces.
Install all the prerequisites manually on your host system. If you would rather not install anything, I suggest using the [container based development flow](#container-based-development-flow-easiest).
- Due to the lock file, this repository works best with [pnpm](https://pnpm.io). I recommend you use `pnpm` because it is cool and efficient!
- You need to have [Docker Engine](https://docs.docker.com/engine/install/) installed locally.
- You need to have [Docker Compose Plugin](https://docs.docker.com/compose/install/compose-plugin/) installed locally.
- You need to have [GIT LFS Support](https://git-lfs.github.com/) installed locally.
Optional:
- To test Heroku buildpacks, you need [pack](https://github.com/buildpacks/pack) binary installed locally.
### Inside a Docker container
`WIP`
## Setup Coolify
- Copy `apps/api/.env.template` to `apps/api/.env` and set the `COOLIFY_APP_ID` environment variable to something cool.
- `pnpm install` to install dependencies.
- `pnpm db:push` to create a local SQLite database.
This will apply all migrations to `db/dev.db`.
- `pnpm db:seed` to seed the database.
- `pnpm dev` to start coding.
## Technical skills required
- **Languages**: Node.js / Javascript / Typescript
- **Framework JS/TS**: [SvelteKit](https://kit.svelte.dev/) & [Fastify](https://www.fastify.io/)
- **Database ORM**: [Prisma.io](https://www.prisma.io/)
- **Docker Engine API**
## How to add a new service?
You can find all details [here](https://github.com/coollabsio/coolify-community-templates)


@@ -1,108 +0,0 @@
---
head:
- - meta
- name: description
content: Coolify - Databases
- - meta
- name: keywords
content: databases coollabs coolify
- - meta
- name: twitter:card
content: summary_large_image
- - meta
- name: twitter:site
content: '@andrasbacsai'
- - meta
- name: twitter:title
content: Coolify
- - meta
- name: twitter:description
content: An open-source & self-hostable Heroku / Netlify alternative.
- - meta
- name: twitter:image
content: https://cdn.coollabs.io/assets/coollabs/og-image-databases.png
- - meta
- property: og:type
content: website
- - meta
- property: og:url
content: https://coolify.io
- - meta
- property: og:title
content: Coolify
- - meta
- property: og:description
content: An open-source & self-hostable Heroku / Netlify alternative.
- - meta
- property: og:site_name
content: Coolify
- - meta
- property: og:image
content: https://cdn.coollabs.io/assets/coollabs/og-image-databases.png
---
# Contribution
First, thanks for considering contributing to my project. It really means a lot! :)
You can ask for guidance anytime on our Discord server in the #contribution channel.
## Setup your development environment
### Github codespaces
If you have GitHub Codespaces enabled, you can just create a codespace and run `pnpm dev` to start the dev environment. All the required dependencies and packages have already been configured for you.
### Gitpod
If you have a [Gitpod](https://gitpod.io) account, you can just create a workspace from this repository, run `pnpm install && pnpm db:push && pnpm db:seed` and then `pnpm dev`. All the required dependencies and packages have already been configured for you.
### Local Machine
> At the moment, Coolify `doesn't support Windows`. You must use `Linux` or `MacOS` or consider using Gitpod or Github Codespaces.
- Due to the lock file, this repository works best with [pnpm](https://pnpm.io). I recommend you use `pnpm` because it is cool and efficient!
- You need to have [Docker Engine](https://docs.docker.com/engine/install/) installed locally.
- You need to have [Docker Compose Plugin](https://docs.docker.com/compose/install/compose-plugin/) installed locally.
- You need to have [GIT LFS Support](https://git-lfs.github.com/) installed locally.
Optional:
- To test Heroku buildpacks, you need [pack](https://github.com/buildpacks/pack) binary installed locally.
### Inside a Docker container
`WIP`
## Setup Coolify
- Copy `apps/api/.env.template` to `apps/api/.env` and set the `COOLIFY_APP_ID` environment variable to something cool.
- `pnpm install` to install dependencies.
- `pnpm db:push` to create a local SQLite database.
This will apply all migrations to `db/dev.db`.
- `pnpm db:seed` to seed the database.
- `pnpm dev` to start coding.
## Technical skills required
- **Languages**: Node.js / Javascript / Typescript
- **Framework JS/TS**: [SvelteKit](https://kit.svelte.dev/) & [Fastify](https://www.fastify.io/)
- **Database ORM**: [Prisma.io](https://www.prisma.io/)
- **Docker Engine API**
## Add a new service
### Which services are eligible to be added to Coolify?
The following statements need to be true:
- Self-hostable
- Open-source
- Maintained (I do not want to add software full of bugs)
### Create Prisma / Database schema for the new service.
All data that needs to persist for a service should be saved to the database, either in `cleartext` or `encrypted` form.
Every password/API key/passphrase needs to be encrypted. If you are not sure whether it should be encrypted or not, just encrypt it.
Update Prisma schema in [src/api/prisma/schema.prisma](https://github.com/coollabsio/coolify/blob/main/apps/api/prisma/schema.prisma).
- Add new model with the new service name.
- Make a relationship with `Service` model.
- In the `Service` model, the name of the new field should start with a lowercase letter.
- If the service needs a database, define a `publicPort` field so its database can be made public; an example field name in the case of PostgreSQL is `postgresqlPublicPort`. It should be an optional field.


@@ -1,5 +1,4 @@
ARG PNPM_VERSION=7.11.0
ARG NPM_VERSION=8.19.1
FROM node:18-slim as build
WORKDIR /app
@@ -17,26 +16,35 @@ WORKDIR /app
ENV NODE_ENV production
ARG TARGETPLATFORM
# https://download.docker.com/linux/static/stable/
ARG DOCKER_VERSION=20.10.18
# https://github.com/docker/compose/releases
# Reverted to 2.6.1 because of this https://github.com/docker/compose/issues/9704. 2.9.0 still has a bug.
ARG DOCKER_COMPOSE_VERSION=2.6.1
# https://github.com/buildpacks/pack/releases
ARG PACK_VERSION=v0.27.0
RUN apt update && apt -y install --no-install-recommends ca-certificates git git-lfs openssh-client curl jq cmake sqlite3 openssl psmisc python3
RUN apt-get clean autoclean && apt-get autoremove --yes && rm -rf /var/lib/{apt,dpkg,cache,log}/
RUN npm --no-update-notifier --no-fund --global install pnpm@${PNPM_VERSION}
RUN npm install -g npm@${PNPM_VERSION}
RUN mkdir -p ~/.docker/cli-plugins/
# https://download.docker.com/linux/static/stable/
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/docker-20.10.9 -o /usr/bin/docker
# https://github.com/docker/compose/releases
# Reverted to 2.6.1 because of this https://github.com/docker/compose/issues/9704. 2.9.0 still has a bug.
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/docker-compose-linux-2.6.1 -o ~/.docker/cli-plugins/docker-compose
RUN chmod +x ~/.docker/cli-plugins/docker-compose /usr/bin/docker
RUN (curl -sSL "https://github.com/buildpacks/pack/releases/download/v0.27.0/pack-v0.27.0-linux.tgz" | tar -C /usr/local/bin/ --no-same-owner -xzv pack)
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/docker-$DOCKER_VERSION -o /usr/bin/docker
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/docker-compose-linux-$DOCKER_COMPOSE_VERSION -o ~/.docker/cli-plugins/docker-compose
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/pack-$PACK_VERSION -o /usr/local/bin/pack
RUN chmod +x ~/.docker/cli-plugins/docker-compose /usr/bin/docker /usr/local/bin/pack
COPY --from=build /app/apps/api/build/ .
COPY --from=build /app/others/fluentbit/ ./fluentbit
COPY --from=build /app/apps/ui/build/ ./public
COPY --from=build /app/apps/api/prisma/ ./prisma
COPY --from=build /app/apps/api/package.json .
COPY --from=build /app/docker-compose.yaml .
COPY --from=build /app/apps/api/tags.json .
COPY --from=build /app/apps/api/templates.json .
RUN pnpm install -p

Dockerfile-dev (new file, 31 lines)

@@ -0,0 +1,31 @@
FROM node:18-slim
ENV NODE_ENV development
ARG TARGETPLATFORM
ARG PNPM_VERSION=7.11.0
ARG NPM_VERSION=8.19.1
# https://download.docker.com/linux/static/stable/
ARG DOCKER_VERSION=20.10.18
# https://github.com/docker/compose/releases
# Reverted to 2.6.1 because of this https://github.com/docker/compose/issues/9704. 2.9.0 still has a bug.
ARG DOCKER_COMPOSE_VERSION=2.6.1
# https://github.com/buildpacks/pack/releases
ARG PACK_VERSION=v0.27.0
WORKDIR /app
RUN npm --no-update-notifier --no-fund --global install pnpm@${PNPM_VERSION}
RUN apt update && apt -y install --no-install-recommends ca-certificates git git-lfs openssh-client curl jq cmake sqlite3 openssl psmisc python3
RUN apt-get clean autoclean && apt-get autoremove --yes && rm -rf /var/lib/{apt,dpkg,cache,log}/
RUN npm --no-update-notifier --no-fund --global install pnpm@${PNPM_VERSION}
RUN npm install -g npm@${PNPM_VERSION}
RUN mkdir -p ~/.docker/cli-plugins/
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/docker-$DOCKER_VERSION -o /usr/bin/docker
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/docker-compose-linux-$DOCKER_COMPOSE_VERSION -o ~/.docker/cli-plugins/docker-compose
RUN curl -SL https://cdn.coollabs.io/bin/$TARGETPLATFORM/pack-$PACK_VERSION -o /usr/local/bin/pack
RUN chmod +x ~/.docker/cli-plugins/docker-compose /usr/bin/docker /usr/local/bin/pack
EXPOSE 3000
ENV CHECKPOINT_DISABLE=1

1
apps/api/devTags.json Normal file

File diff suppressed because one or more lines are too long

2786
apps/api/devTemplates.yaml Normal file

File diff suppressed because it is too large Load Diff

View File

@@ -3,41 +3,45 @@
"description": "Coolify's Fastify API",
"license": "Apache-2.0",
"scripts": {
"db:generate": "prisma generate",
"db:push": "prisma db push && prisma generate",
"db:seed": "prisma db seed",
"db:studio": "prisma studio",
"db:migrate": "COOLIFY_DATABASE_URL=file:../db/migration.db prisma migrate dev --skip-seed --name",
"dev": "nodemon",
"build": "rimraf build && esbuild `find src \\( -name '*.ts' \\)| grep -v client/` --platform=node --outdir=build --format=cjs",
"build": "rimraf build && esbuild `find src \\( -name '*.ts' \\)| grep -v client/` --minify=true --platform=node --outdir=build --format=cjs",
"format": "prettier --write 'src/**/*.{js,ts,json,md}'",
"lint": "prettier --check 'src/**/*.{js,ts,json,md}' && eslint --ignore-path .eslintignore .",
"start": "NODE_ENV=production npx -y prisma migrate deploy && npx prisma generate && npx prisma db seed && node index.js"
"start": "NODE_ENV=production pnpm prisma migrate deploy && pnpm prisma generate && pnpm prisma db seed && node index.js"
},
"dependencies": {
"@breejs/ts-worker": "2.0.0",
"@fastify/autoload": "5.3.1",
"@fastify/cookie": "8.1.0",
"@fastify/cors": "8.1.0",
"@fastify/autoload": "5.4.1",
"@fastify/cookie": "8.3.0",
"@fastify/cors": "8.1.1",
"@fastify/env": "4.1.0",
"@fastify/jwt": "6.3.2",
"@fastify/multipart": "7.3.0",
"@fastify/static": "6.5.0",
"@iarna/toml": "2.2.5",
"@ladjs/graceful": "3.0.2",
"@prisma/client": "4.3.1",
"axios": "0.27.2",
"@prisma/client": "4.5.0",
"bcryptjs": "2.4.3",
"bree": "9.1.2",
"cabin": "9.1.2",
"compare-versions": "5.0.1",
"csv-parse": "5.3.1",
"csvtojson": "2.0.10",
"cuid": "2.1.8",
"dayjs": "1.11.5",
"dayjs": "1.11.6",
"dockerode": "3.3.4",
"dotenv-extended": "2.9.0",
"execa": "6.1.0",
"fastify": "4.5.3",
"fastify-plugin": "4.2.1",
"fastify": "4.9.2",
"fastify-plugin": "4.3.0",
"fastify-socket.io": "4.0.0",
"generate-password": "1.7.0",
"got": "12.4.1",
"got": "12.5.2",
"is-ip": "5.0.0",
"is-port-reachable": "4.0.0",
"js-yaml": "4.1.0",
@@ -46,26 +50,29 @@
"node-os-utils": "1.3.7",
"p-all": "4.0.0",
"p-throttle": "5.0.0",
"prisma": "4.5.0",
"public-ip": "6.0.1",
"pump": "3.0.0",
"socket.io": "4.5.3",
"ssh-config": "4.1.6",
"strip-ansi": "7.0.1",
"unique-names-generator": "4.7.1"
},
"devDependencies": {
"@types/node": "18.7.15",
"@types/node": "18.11.6",
"@types/node-os-utils": "1.3.0",
"@typescript-eslint/eslint-plugin": "5.36.2",
"@typescript-eslint/parser": "5.36.2",
"esbuild": "0.15.7",
"eslint": "8.23.0",
"@typescript-eslint/eslint-plugin": "5.41.0",
"@typescript-eslint/parser": "5.41.0",
"esbuild": "0.15.12",
"eslint": "8.26.0",
"eslint-config-prettier": "8.5.0",
"eslint-plugin-prettier": "4.2.1",
"nodemon": "2.0.19",
"nodemon": "2.0.20",
"prettier": "2.7.1",
"prisma": "4.3.1",
"rimraf": "3.0.2",
"tsconfig-paths": "4.1.0",
"typescript": "4.8.2"
"types-fastify-socket.io": "0.0.1",
"typescript": "4.8.4"
},
"prisma": {
"seed": "node prisma/seed.js"

View File

@@ -0,0 +1,18 @@
-- AlterTable
ALTER TABLE "Build" ADD COLUMN "previewApplicationId" TEXT;
-- CreateTable
CREATE TABLE "PreviewApplication" (
"id" TEXT NOT NULL PRIMARY KEY,
"pullmergeRequestId" TEXT NOT NULL,
"sourceBranch" TEXT NOT NULL,
"isRandomDomain" BOOLEAN NOT NULL DEFAULT false,
"customDomain" TEXT,
"applicationId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "PreviewApplication_applicationId_fkey" FOREIGN KEY ("applicationId") REFERENCES "Application" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateIndex
CREATE UNIQUE INDEX "PreviewApplication_applicationId_key" ON "PreviewApplication"("applicationId");

View File

@@ -0,0 +1,10 @@
-- CreateTable
CREATE TABLE "Certificate" (
"id" TEXT NOT NULL PRIMARY KEY,
"key" TEXT NOT NULL,
"cert" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
"teamId" TEXT,
CONSTRAINT "Certificate_teamId_fkey" FOREIGN KEY ("teamId") REFERENCES "Team" ("id") ON DELETE SET NULL ON UPDATE CASCADE
);

View File

@@ -0,0 +1,23 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_ApplicationSettings" (
"id" TEXT NOT NULL PRIMARY KEY,
"applicationId" TEXT NOT NULL,
"dualCerts" BOOLEAN NOT NULL DEFAULT false,
"debug" BOOLEAN NOT NULL DEFAULT false,
"previews" BOOLEAN NOT NULL DEFAULT false,
"autodeploy" BOOLEAN NOT NULL DEFAULT true,
"isBot" BOOLEAN NOT NULL DEFAULT false,
"isPublicRepository" BOOLEAN NOT NULL DEFAULT false,
"isDBBranching" BOOLEAN NOT NULL DEFAULT false,
"isCustomSSL" BOOLEAN NOT NULL DEFAULT false,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "ApplicationSettings_applicationId_fkey" FOREIGN KEY ("applicationId") REFERENCES "Application" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
INSERT INTO "new_ApplicationSettings" ("applicationId", "autodeploy", "createdAt", "debug", "dualCerts", "id", "isBot", "isDBBranching", "isPublicRepository", "previews", "updatedAt") SELECT "applicationId", "autodeploy", "createdAt", "debug", "dualCerts", "id", "isBot", "isDBBranching", "isPublicRepository", "previews", "updatedAt" FROM "ApplicationSettings";
DROP TABLE "ApplicationSettings";
ALTER TABLE "new_ApplicationSettings" RENAME TO "ApplicationSettings";
CREATE UNIQUE INDEX "ApplicationSettings_applicationId_key" ON "ApplicationSettings"("applicationId");
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,26 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_GitSource" (
"id" TEXT NOT NULL PRIMARY KEY,
"name" TEXT NOT NULL,
"forPublic" BOOLEAN NOT NULL DEFAULT false,
"type" TEXT,
"apiUrl" TEXT,
"htmlUrl" TEXT,
"customPort" INTEGER NOT NULL DEFAULT 22,
"organization" TEXT,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
"githubAppId" TEXT,
"gitlabAppId" TEXT,
"isSystemWide" BOOLEAN NOT NULL DEFAULT false,
CONSTRAINT "GitSource_gitlabAppId_fkey" FOREIGN KEY ("gitlabAppId") REFERENCES "GitlabApp" ("id") ON DELETE SET NULL ON UPDATE CASCADE,
CONSTRAINT "GitSource_githubAppId_fkey" FOREIGN KEY ("githubAppId") REFERENCES "GithubApp" ("id") ON DELETE SET NULL ON UPDATE CASCADE
);
INSERT INTO "new_GitSource" ("apiUrl", "createdAt", "customPort", "forPublic", "githubAppId", "gitlabAppId", "htmlUrl", "id", "name", "organization", "type", "updatedAt") SELECT "apiUrl", "createdAt", "customPort", "forPublic", "githubAppId", "gitlabAppId", "htmlUrl", "id", "name", "organization", "type", "updatedAt" FROM "GitSource";
DROP TABLE "GitSource";
ALTER TABLE "new_GitSource" RENAME TO "GitSource";
CREATE UNIQUE INDEX "GitSource_githubAppId_key" ON "GitSource"("githubAppId");
CREATE UNIQUE INDEX "GitSource_gitlabAppId_key" ON "GitSource"("gitlabAppId");
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,2 @@
-- DropIndex
DROP INDEX "PreviewApplication_applicationId_key";

View File

@@ -0,0 +1,2 @@
-- AlterTable
ALTER TABLE "Build" ADD COLUMN "sourceRepository" TEXT;

View File

@@ -0,0 +1,3 @@
-- AlterTable
ALTER TABLE "Application" ADD COLUMN "dockerComposeFile" TEXT;
ALTER TABLE "Application" ADD COLUMN "dockerComposeFileLocation" TEXT;

View File

@@ -0,0 +1,2 @@
-- AlterTable
ALTER TABLE "Application" ADD COLUMN "dockerComposeConfiguration" TEXT;

View File

@@ -0,0 +1,13 @@
-- CreateTable
CREATE TABLE "ServiceSetting" (
"id" TEXT NOT NULL PRIMARY KEY,
"serviceId" TEXT NOT NULL,
"name" TEXT NOT NULL,
"value" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "ServiceSetting_serviceId_fkey" FOREIGN KEY ("serviceId") REFERENCES "Service" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateIndex
CREATE UNIQUE INDEX "ServiceSetting_serviceId_name_key" ON "ServiceSetting"("serviceId", "name");

View File

@@ -0,0 +1,19 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_ServicePersistentStorage" (
"id" TEXT NOT NULL PRIMARY KEY,
"serviceId" TEXT NOT NULL,
"path" TEXT NOT NULL,
"volumeName" TEXT,
"predefined" BOOLEAN NOT NULL DEFAULT false,
"containerId" TEXT,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "ServicePersistentStorage_serviceId_fkey" FOREIGN KEY ("serviceId") REFERENCES "Service" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
INSERT INTO "new_ServicePersistentStorage" ("createdAt", "id", "path", "serviceId", "updatedAt") SELECT "createdAt", "id", "path", "serviceId", "updatedAt" FROM "ServicePersistentStorage";
DROP TABLE "ServicePersistentStorage";
ALTER TABLE "new_ServicePersistentStorage" RENAME TO "ServicePersistentStorage";
CREATE UNIQUE INDEX "ServicePersistentStorage_serviceId_path_key" ON "ServicePersistentStorage"("serviceId", "path");
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,24 @@
/*
Warnings:
- Added the required column `variableName` to the `ServiceSetting` table without a default value. This is not possible if the table is not empty.
*/
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_ServiceSetting" (
"id" TEXT NOT NULL PRIMARY KEY,
"serviceId" TEXT NOT NULL,
"name" TEXT NOT NULL,
"value" TEXT NOT NULL,
"variableName" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "ServiceSetting_serviceId_fkey" FOREIGN KEY ("serviceId") REFERENCES "Service" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
INSERT INTO "new_ServiceSetting" ("createdAt", "id", "name", "serviceId", "updatedAt", "value") SELECT "createdAt", "id", "name", "serviceId", "updatedAt", "value" FROM "ServiceSetting";
DROP TABLE "ServiceSetting";
ALTER TABLE "new_ServiceSetting" RENAME TO "ServiceSetting";
CREATE UNIQUE INDEX "ServiceSetting_serviceId_name_key" ON "ServiceSetting"("serviceId", "name");
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,21 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_Service" (
"id" TEXT NOT NULL PRIMARY KEY,
"name" TEXT NOT NULL,
"fqdn" TEXT,
"exposePort" INTEGER,
"dualCerts" BOOLEAN NOT NULL DEFAULT false,
"type" TEXT,
"version" TEXT,
"templateVersion" TEXT NOT NULL DEFAULT '0.0.0',
"destinationDockerId" TEXT,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "Service_destinationDockerId_fkey" FOREIGN KEY ("destinationDockerId") REFERENCES "DestinationDocker" ("id") ON DELETE SET NULL ON UPDATE CASCADE
);
INSERT INTO "new_Service" ("createdAt", "destinationDockerId", "dualCerts", "exposePort", "fqdn", "id", "name", "type", "updatedAt", "version") SELECT "createdAt", "destinationDockerId", "dualCerts", "exposePort", "fqdn", "id", "name", "type", "updatedAt", "version" FROM "Service";
DROP TABLE "Service";
ALTER TABLE "new_Service" RENAME TO "Service";
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,11 @@
/*
Warnings:
- A unique constraint covering the columns `[serviceId,containerId,path]` on the table `ServicePersistentStorage` will be added. If there are existing duplicate values, this will fail.
*/
-- DropIndex
DROP INDEX "ServicePersistentStorage_serviceId_path_key";
-- CreateIndex
CREATE UNIQUE INDEX "ServicePersistentStorage_serviceId_containerId_path_key" ON "ServicePersistentStorage"("serviceId", "containerId", "path");

View File

@@ -0,0 +1,32 @@
-- RedefineTables
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_Wordpress" (
"id" TEXT NOT NULL PRIMARY KEY,
"extraConfig" TEXT,
"tablePrefix" TEXT,
"ownMysql" BOOLEAN NOT NULL DEFAULT false,
"mysqlHost" TEXT,
"mysqlPort" INTEGER,
"mysqlUser" TEXT,
"mysqlPassword" TEXT,
"mysqlRootUser" TEXT,
"mysqlRootUserPassword" TEXT,
"mysqlDatabase" TEXT,
"mysqlPublicPort" INTEGER,
"ftpEnabled" BOOLEAN NOT NULL DEFAULT false,
"ftpUser" TEXT,
"ftpPassword" TEXT,
"ftpPublicPort" INTEGER,
"ftpHostKey" TEXT,
"ftpHostKeyPrivate" TEXT,
"serviceId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "Wordpress_serviceId_fkey" FOREIGN KEY ("serviceId") REFERENCES "Service" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);
INSERT INTO "new_Wordpress" ("createdAt", "extraConfig", "ftpEnabled", "ftpHostKey", "ftpHostKeyPrivate", "ftpPassword", "ftpPublicPort", "ftpUser", "id", "mysqlDatabase", "mysqlHost", "mysqlPassword", "mysqlPort", "mysqlPublicPort", "mysqlRootUser", "mysqlRootUserPassword", "mysqlUser", "ownMysql", "serviceId", "tablePrefix", "updatedAt") SELECT "createdAt", "extraConfig", "ftpEnabled", "ftpHostKey", "ftpHostKeyPrivate", "ftpPassword", "ftpPublicPort", "ftpUser", "id", "mysqlDatabase", "mysqlHost", "mysqlPassword", "mysqlPort", "mysqlPublicPort", "mysqlRootUser", "mysqlRootUserPassword", "mysqlUser", "ownMysql", "serviceId", "tablePrefix", "updatedAt" FROM "Wordpress";
DROP TABLE "Wordpress";
ALTER TABLE "new_Wordpress" RENAME TO "Wordpress";
CREATE UNIQUE INDEX "Wordpress_serviceId_key" ON "Wordpress"("serviceId");
PRAGMA foreign_key_check;
PRAGMA foreign_keys=ON;

View File

@@ -0,0 +1,2 @@
-- AlterTable
ALTER TABLE "Setting" ADD COLUMN "proxyDefaultRedirect" TEXT;

View File

@@ -8,6 +8,16 @@ datasource db {
url = env("COOLIFY_DATABASE_URL")
}
model Certificate {
id String @id @default(cuid())
key String
cert String
team Team? @relation(fields: [teamId], references: [id])
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
teamId String?
}
model Setting {
id String @id @default(cuid())
fqdn String? @unique
@@ -19,6 +29,7 @@ model Setting {
proxyPassword String
proxyUser String
proxyHash String?
proxyDefaultRedirect String?
isAutoUpdateEnabled Boolean @default(false)
isDNSCheckEnabled Boolean @default(true)
DNSServers String?
@@ -70,6 +81,7 @@ model Team {
gitLabApps GitlabApp[]
service Service[]
users User[]
certificate Certificate[]
}
model TeamInvitation {
@@ -83,42 +95,58 @@ model TeamInvitation {
}
model Application {
id String @id @default(cuid())
name String
fqdn String?
repository String?
configHash String?
branch String?
buildPack String?
projectId Int?
port Int?
exposePort Int?
installCommand String?
buildCommand String?
startCommand String?
baseDirectory String?
publishDirectory String?
deploymentType String?
phpModules String?
pythonWSGI String?
pythonModule String?
pythonVariable String?
dockerFileLocation String?
denoMainFile String?
denoOptions String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
destinationDockerId String?
gitSourceId String?
baseImage String?
baseBuildImage String?
gitSource GitSource? @relation(fields: [gitSourceId], references: [id])
destinationDocker DestinationDocker? @relation(fields: [destinationDockerId], references: [id])
persistentStorage ApplicationPersistentStorage[]
settings ApplicationSettings?
secrets Secret[]
teams Team[]
connectedDatabase ApplicationConnectedDatabase?
id String @id @default(cuid())
name String
fqdn String?
repository String?
configHash String?
branch String?
buildPack String?
projectId Int?
port Int?
exposePort Int?
installCommand String?
buildCommand String?
startCommand String?
baseDirectory String?
publishDirectory String?
deploymentType String?
phpModules String?
pythonWSGI String?
pythonModule String?
pythonVariable String?
dockerFileLocation String?
denoMainFile String?
denoOptions String?
dockerComposeFile String?
dockerComposeFileLocation String?
dockerComposeConfiguration String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
destinationDockerId String?
gitSourceId String?
baseImage String?
baseBuildImage String?
gitSource GitSource? @relation(fields: [gitSourceId], references: [id])
destinationDocker DestinationDocker? @relation(fields: [destinationDockerId], references: [id])
persistentStorage ApplicationPersistentStorage[]
settings ApplicationSettings?
secrets Secret[]
teams Team[]
connectedDatabase ApplicationConnectedDatabase?
previewApplication PreviewApplication[]
}
model PreviewApplication {
id String @id @default(cuid())
pullmergeRequestId String
sourceBranch String
isRandomDomain Boolean @default(false)
customDomain String?
applicationId String
application Application @relation(fields: [applicationId], references: [id])
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}
model ApplicationConnectedDatabase {
@@ -148,6 +176,7 @@ model ApplicationSettings {
isBot Boolean @default(false)
isPublicRepository Boolean @default(false)
isDBBranching Boolean @default(false)
isCustomSSL Boolean @default(false)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
application Application @relation(fields: [applicationId], references: [id])
@@ -165,14 +194,17 @@ model ApplicationPersistentStorage {
}
model ServicePersistentStorage {
id String @id @default(cuid())
serviceId String
path String
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
service Service @relation(fields: [serviceId], references: [id])
id String @id @default(cuid())
serviceId String
path String
volumeName String?
predefined Boolean @default(false)
containerId String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
service Service @relation(fields: [serviceId], references: [id])
@@unique([serviceId, path])
@@unique([serviceId, containerId, path])
}
model Secret {
@@ -210,21 +242,23 @@ model BuildLog {
}
model Build {
id String @id @default(cuid())
type String
applicationId String?
destinationDockerId String?
gitSourceId String?
githubAppId String?
gitlabAppId String?
commit String?
pullmergeRequestId String?
forceRebuild Boolean @default(false)
sourceBranch String?
branch String?
status String? @default("queued")
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
id String @id @default(cuid())
type String
applicationId String?
destinationDockerId String?
gitSourceId String?
githubAppId String?
gitlabAppId String?
commit String?
pullmergeRequestId String?
previewApplicationId String?
forceRebuild Boolean @default(false)
sourceBranch String?
sourceRepository String?
branch String?
status String? @default("queued")
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}
model DestinationDocker {
@@ -273,6 +307,7 @@ model GitSource {
updatedAt DateTime @updatedAt
githubAppId String? @unique
gitlabAppId String? @unique
isSystemWide Boolean @default(false)
gitlabApp GitlabApp? @relation(fields: [gitlabAppId], references: [id])
githubApp GithubApp? @relation(fields: [githubAppId], references: [id])
application Application[]
@@ -361,12 +396,14 @@ model Service {
dualCerts Boolean @default(false)
type String?
version String?
templateVersion String @default("0.0.0")
destinationDockerId String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
destinationDocker DestinationDocker? @relation(fields: [destinationDockerId], references: [id])
persistentStorage ServicePersistentStorage[]
serviceSecret ServiceSecret[]
serviceSetting ServiceSetting[]
teams Team[]
fider Fider?
@@ -386,6 +423,19 @@ model Service {
taiga Taiga?
}
model ServiceSetting {
id String @id @default(cuid())
serviceId String
name String
value String
variableName String
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
service Service @relation(fields: [serviceId], references: [id])
@@unique([serviceId, name])
}
model PlausibleAnalytics {
id String @id @default(cuid())
email String?
@@ -431,10 +481,10 @@ model Wordpress {
ownMysql Boolean @default(false)
mysqlHost String?
mysqlPort Int?
mysqlUser String
mysqlPassword String
mysqlRootUser String
mysqlRootUserPassword String
mysqlUser String?
mysqlPassword String?
mysqlRootUser String?
mysqlRootUserPassword String?
mysqlDatabase String?
mysqlPublicPort Int?
ftpEnabled Boolean @default(false)

View File

@@ -94,6 +94,16 @@ async function main() {
}
});
}
// Set new preview secrets
const secrets = await prisma.secret.findMany({ where: { isPRMRSecret: false } })
if (secrets.length > 0) {
for (const secret of secrets) {
const previewSecrets = await prisma.secret.findMany({ where: { applicationId: secret.applicationId, name: secret.name, isPRMRSecret: true } })
if (previewSecrets.length === 0) {
await prisma.secret.create({ data: { ...secret, id: undefined, isPRMRSecret: true } })
}
}
}
}
main()
.catch((e) => {

View File

@@ -0,0 +1,67 @@
import fs from 'fs/promises';
import yaml from 'js-yaml';
import got from 'got';
const repositories = [];
const templates = await fs.readFile('./apps/api/devTemplates.yaml', 'utf8');
const devTemplates = yaml.load(templates);
for (const template of devTemplates) {
let image = template.services['$$id'].image.replaceAll(':$$core_version', '');
if (!image.includes('/')) {
image = `library/${image}`;
}
repositories.push({ image, name: template.type });
}
const services = []
const numberOfTags = 30;
// const semverRegex = new RegExp(/^v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-((?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+([0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$/g)
for (const repository of repositories) {
console.log('Querying', repository.name, 'at', repository.image);
let semverRegex = new RegExp(/^v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)$/g)
if (repository.name.startsWith('wordpress')) {
semverRegex = new RegExp(/^v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)-php(0|[1-9]\d*)$/g)
}
if (repository.name.startsWith('minio')) {
semverRegex = new RegExp(/^RELEASE.*$/g)
}
if (repository.name.startsWith('fider')) {
semverRegex = new RegExp(/^v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)-([0-9]+)$/g)
}
if (repository.name.startsWith('searxng')) {
semverRegex = new RegExp(/^\d{4}[\.\-](0?[1-9]|[12][0-9]|3[01])[\.\-](0?[1-9]|1[012]).*$/)
}
if (repository.name.startsWith('umami')) {
semverRegex = new RegExp(/^postgresql-v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)-([0-9]+)$/g)
}
if (repository.image.includes('ghcr.io')) {
const { execaCommand } = await import('execa');
const { stdout } = await execaCommand(`docker run --rm quay.io/skopeo/stable list-tags docker://${repository.image}`);
if (stdout) {
const json = JSON.parse(stdout);
const semverTags = json.Tags.filter((tag) => semverRegex.test(tag))
let tags = semverTags.length > 10 ? semverTags.sort().reverse().slice(0, numberOfTags) : json.Tags.sort().reverse().slice(0, numberOfTags)
if (!tags.includes('latest')) {
tags.push('latest')
}
services.push({ name: repository.name, image: repository.image, tags })
}
} else {
const { token } = await got.get(`https://auth.docker.io/token?service=registry.docker.io&scope=repository:${repository.image}:pull`).json()
let data = await got.get(`https://registry-1.docker.io/v2/${repository.image}/tags/list`, {
headers: {
Authorization: `Bearer ${token}`
}
}).json()
const semverTags = data.tags.filter((tag) => semverRegex.test(tag))
let tags = semverTags.length > 10 ? semverTags.sort().reverse().slice(0, numberOfTags) : data.tags.sort().reverse().slice(0, numberOfTags)
if (!tags.includes('latest')) {
tags.push('latest')
}
services.push({
name: repository.name,
image: repository.image,
tags
})
}
}
await fs.writeFile('./apps/api/devTags.json', JSON.stringify(services));

View File

@@ -3,12 +3,23 @@ import cors from '@fastify/cors';
import serve from '@fastify/static';
import env from '@fastify/env';
import cookie from '@fastify/cookie';
import multipart from '@fastify/multipart';
import path, { join } from 'path';
import autoLoad from '@fastify/autoload';
import { asyncExecShell, createRemoteEngineConfiguration, getDomain, isDev, listSettings, prisma, version } from './lib/common';
import socketIO from 'fastify-socket.io'
import socketIOServer from './realtime'
import { asyncExecShell, cleanupDockerStorage, createRemoteEngineConfiguration, decrypt, encrypt, executeDockerCmd, executeSSHCmd, generateDatabaseConfiguration, isDev, listSettings, prisma, startTraefikProxy, startTraefikTCPProxy, version } from './lib/common';
import { scheduler } from './lib/scheduler';
import { compareVersions } from 'compare-versions';
import Graceful from '@ladjs/graceful'
import yaml from 'js-yaml'
import fs from 'fs/promises';
import { verifyRemoteDockerEngineFn } from './routes/api/v1/destinations/handlers';
import { checkContainer } from './lib/docker';
import { migrateServicesToNewTemplate } from './lib';
import { refreshTags, refreshTemplates } from './routes/api/v1/handlers';
declare module 'fastify' {
interface FastifyInstance {
config: {
@@ -26,11 +37,13 @@ declare module 'fastify' {
const port = isDev ? 3001 : 3000;
const host = '0.0.0.0';
prisma.setting.findFirst().then(async (settings) => {
(async () => {
const settings = await prisma.setting.findFirst()
const fastify = Fastify({
logger: settings?.isAPIDebuggingEnabled || false,
trustProxy: true
});
const schema = {
type: 'object',
required: ['COOLIFY_SECRET_KEY', 'COOLIFY_DATABASE_URL', 'COOLIFY_IS_ON'],
@@ -68,7 +81,6 @@ prisma.setting.findFirst().then(async (settings) => {
}
};
const options = {
schema,
dotenv: true
@@ -88,39 +100,49 @@ prisma.setting.findFirst().then(async (settings) => {
return reply.status(200).sendFile('index.html');
});
}
fastify.register(multipart, { limits: { fileSize: 100000 } });
fastify.register(autoLoad, {
dir: join(__dirname, 'plugins')
});
fastify.register(autoLoad, {
dir: join(__dirname, 'routes')
});
fastify.register(cookie)
fastify.register(cors);
fastify.addHook('onRequest', async (request, reply) => {
let allowedList = ['coolify:3000'];
const { ipv4, ipv6, fqdn } = await prisma.setting.findFirst({})
ipv4 && allowedList.push(`${ipv4}:3000`);
ipv6 && allowedList.push(ipv6);
fqdn && allowedList.push(getDomain(fqdn));
isDev && allowedList.push('localhost:3000') && allowedList.push('localhost:3001') && allowedList.push('host.docker.internal:3001');
const remotes = await prisma.destinationDocker.findMany({ where: { remoteEngine: true, remoteVerified: true } })
if (remotes.length > 0) {
remotes.forEach(remote => {
allowedList.push(`${remote.remoteIpAddress}:3000`);
})
}
if (!allowedList.includes(request.headers.host)) {
// console.log('not allowed', request.headers.host)
fastify.register(socketIO, {
cors: {
origin: isDev ? "*" : ''
}
})
fastify.listen({ port, host }, async (err: any, address: any) => {
if (err) {
console.error(err);
process.exit(1);
}
// To detect allowed origins
// fastify.addHook('onRequest', async (request, reply) => {
// console.log(request.headers.host)
// let allowedList = ['coolify:3000'];
// const { ipv4, ipv6, fqdn } = await prisma.setting.findFirst({})
// ipv4 && allowedList.push(`${ipv4}:3000`);
// ipv6 && allowedList.push(ipv6);
// fqdn && allowedList.push(getDomain(fqdn));
// isDev && allowedList.push('localhost:3000') && allowedList.push('localhost:3001') && allowedList.push('host.docker.internal:3001');
// const remotes = await prisma.destinationDocker.findMany({ where: { remoteEngine: true, remoteVerified: true } })
// if (remotes.length > 0) {
// remotes.forEach(remote => {
// allowedList.push(`${remote.remoteIpAddress}:3000`);
// })
// }
// if (!allowedList.includes(request.headers.host)) {
// // console.log('not allowed', request.headers.host)
// }
// })
try {
await fastify.listen({ port, host })
await socketIOServer(fastify)
console.log(`Coolify's API is listening on ${host}:${port}`);
migrateServicesToNewTemplate()
await initServer();
const graceful = new Graceful({ brees: [scheduler] });
@@ -130,57 +152,92 @@ prisma.setting.findFirst().then(async (settings) => {
if (!scheduler.workers.has('deployApplication')) {
scheduler.run('deployApplication');
}
if (!scheduler.workers.has('infrastructure')) {
scheduler.run('infrastructure');
}
}, 2000)
// autoUpdater
setInterval(async () => {
scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:autoUpdater")
}, isDev ? 5000 : 60000 * 15)
await autoUpdater()
}, 60000 * 15)
// cleanupStorage
setInterval(async () => {
scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:cleanupStorage")
}, isDev ? 6000 : 60000 * 10)
await cleanupStorage()
}, 60000 * 10)
// checkProxies
// checkProxies, checkFluentBit & refresh templates
setInterval(async () => {
scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:checkProxies")
await checkProxies();
await checkFluentBit();
}, 60000)
// Refresh and check templates
setInterval(async () => {
await refreshTemplates()
await refreshTags()
await migrateServicesToNewTemplate()
}, 60000)
setInterval(async () => {
await copySSLCertificates();
}, 10000)
// cleanupPrismaEngines
// setInterval(async () => {
// scheduler.workers.has('infrastructure') && scheduler.workers.get('infrastructure').postMessage("action:cleanupPrismaEngines")
// }, 60000)
await Promise.all([
getTagsTemplates(),
getArch(),
getIPAddress(),
configureRemoteDockers(),
])
});
})
} catch (error) {
console.error(error);
process.exit(1);
}
})();
async function getIPAddress() {
const { publicIpv4, publicIpv6 } = await import('public-ip')
try {
const settings = await listSettings();
if (!settings.ipv4) {
console.log(`Getting public IPv4 address...`);
const ipv4 = await publicIpv4({ timeout: 2000 })
await prisma.setting.update({ where: { id: settings.id }, data: { ipv4 } })
}
if (!settings.ipv6) {
console.log(`Getting public IPv6 address...`);
const ipv6 = await publicIpv6({ timeout: 2000 })
await prisma.setting.update({ where: { id: settings.id }, data: { ipv6 } })
}
} catch (error) { }
}
async function getTagsTemplates() {
const { default: got } = await import('got')
try {
if (isDev) {
const templates = await fs.readFile('./devTemplates.yaml', 'utf8')
const tags = await fs.readFile('./devTags.json', 'utf8')
await fs.writeFile('./templates.json', JSON.stringify(yaml.load(templates)))
await fs.writeFile('./tags.json', tags)
console.log('Tags and templates loaded in dev mode...')
} else {
const tags = await got.get('https://get.coollabs.io/coolify/service-tags.json').text()
const response = await got.get('https://get.coollabs.io/coolify/service-templates.yaml').text()
await fs.writeFile('/app/templates.json', JSON.stringify(yaml.load(response)))
await fs.writeFile('/app/tags.json', tags)
console.log('Tags and templates loaded...')
}
} catch (error) {
console.log("Couldn't get latest templates.")
console.log(error)
}
}
async function initServer() {
try {
console.log(`Initializing server...`);
await asyncExecShell(`docker network create --attachable coolify`);
} catch (error) { }
try {
@@ -190,10 +247,12 @@ async function initServer() {
}
} catch (error) { }
}
async function getArch() {
try {
const settings = await prisma.setting.findFirst({})
if (settings && !settings.arch) {
console.log(`Getting architecture...`);
await prisma.setting.update({ where: { id: settings.id }, data: { arch: process.arch } })
}
} catch (error) { }
@@ -205,9 +264,247 @@ async function configureRemoteDockers() {
where: { remoteVerified: true, remoteEngine: true }
});
if (remoteDocker.length > 0) {
console.log(`Verifying Remote Docker Engines...`);
for (const docker of remoteDocker) {
await createRemoteEngineConfiguration(docker.id)
console.log('Verifying:', docker.remoteIpAddress)
await verifyRemoteDockerEngineFn(docker.id);
}
}
} catch (error) { }
} catch (error) {
console.log(error)
}
}
async function autoUpdater() {
try {
const { default: got } = await import('got')
const currentVersion = version;
const { coolify } = await got.get('https://get.coollabs.io/versions.json', {
searchParams: {
appId: process.env['COOLIFY_APP_ID'] || undefined,
version: currentVersion
}
}).json()
const latestVersion = coolify.main.version;
const isUpdateAvailable = compareVersions(latestVersion, currentVersion);
if (isUpdateAvailable === 1) {
const activeCount = 0
if (activeCount === 0) {
if (!isDev) {
const { isAutoUpdateEnabled } = await prisma.setting.findFirst();
if (isAutoUpdateEnabled) {
await asyncExecShell(`docker pull coollabsio/coolify:${latestVersion}`);
await asyncExecShell(`env | grep '^COOLIFY' > .env`);
await asyncExecShell(
`sed -i '/COOLIFY_AUTO_UPDATE=/cCOOLIFY_AUTO_UPDATE=${isAutoUpdateEnabled}' .env`
);
await asyncExecShell(
`docker run --rm -tid --env-file .env -v /var/run/docker.sock:/var/run/docker.sock -v coolify-db coollabsio/coolify:${latestVersion} /bin/sh -c "env | grep COOLIFY > .env && echo 'TAG=${latestVersion}' >> .env && docker stop -t 0 coolify coolify-fluentbit && docker rm coolify coolify-fluentbit && docker compose pull && docker compose up -d --force-recreate"`
);
}
} else {
console.log('Updating (not really in dev mode).');
}
}
}
} catch (error) {
console.log(error)
}
}
async function checkFluentBit() {
try {
if (!isDev) {
const engine = '/var/run/docker.sock';
const { id } = await prisma.destinationDocker.findFirst({
where: { engine, network: 'coolify' }
});
const { found } = await checkContainer({ dockerId: id, container: 'coolify-fluentbit', remove: true });
if (!found) {
await asyncExecShell(`env | grep '^COOLIFY' > .env`);
await asyncExecShell(`docker compose up -d fluent-bit`);
}
}
} catch (error) {
console.log(error)
}
}
async function checkProxies() {
try {
const { default: isReachable } = await import('is-port-reachable');
let portReachable;
const { arch, ipv4, ipv6 } = await listSettings();
// Coolify Proxy local
const engine = '/var/run/docker.sock';
const localDocker = await prisma.destinationDocker.findFirst({
where: { engine, network: 'coolify', isCoolifyProxyUsed: true }
});
if (localDocker) {
portReachable = await isReachable(80, { host: ipv4 || ipv6 })
if (!portReachable) {
await startTraefikProxy(localDocker.id);
}
}
// Coolify Proxy remote
const remoteDocker = await prisma.destinationDocker.findMany({
where: { remoteEngine: true, remoteVerified: true }
});
if (remoteDocker.length > 0) {
for (const docker of remoteDocker) {
if (docker.isCoolifyProxyUsed) {
portReachable = await isReachable(80, { host: docker.remoteIpAddress })
if (!portReachable) {
await startTraefikProxy(docker.id);
}
}
try {
await createRemoteEngineConfiguration(docker.id)
} catch (error) { }
}
}
// TCP Proxies
const databasesWithPublicPort = await prisma.database.findMany({
where: { publicPort: { not: null } },
include: { settings: true, destinationDocker: true }
});
for (const database of databasesWithPublicPort) {
const { destinationDockerId, destinationDocker, publicPort, id } = database;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
const { privatePort } = generateDatabaseConfiguration(database, arch);
await startTraefikTCPProxy(destinationDocker, id, publicPort, privatePort);
}
}
const wordpressWithFtp = await prisma.wordpress.findMany({
where: { ftpPublicPort: { not: null } },
include: { service: { include: { destinationDocker: true } } }
});
for (const ftp of wordpressWithFtp) {
const { service, ftpPublicPort } = ftp;
const { destinationDockerId, destinationDocker, id } = service;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
await startTraefikTCPProxy(destinationDocker, id, ftpPublicPort, 22, 'wordpressftp');
}
}
// HTTP Proxies
// const minioInstances = await prisma.minio.findMany({
// where: { publicPort: { not: null } },
// include: { service: { include: { destinationDocker: true } } }
// });
// for (const minio of minioInstances) {
// const { service, publicPort } = minio;
// const { destinationDockerId, destinationDocker, id } = service;
// if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
// await startTraefikTCPProxy(destinationDocker, id, publicPort, 9000);
// }
// }
} catch (error) {
}
}
async function copySSLCertificates() {
try {
const pAll = await import('p-all');
const actions = []
const certificates = await prisma.certificate.findMany({ include: { team: true } })
const teamIds = certificates.map(c => c.teamId)
const destinations = await prisma.destinationDocker.findMany({ where: { isCoolifyProxyUsed: true, teams: { some: { id: { in: [...teamIds] } } } } })
for (const certificate of certificates) {
const { id, key, cert } = certificate
const decryptedKey = decrypt(key)
await fs.writeFile(`/tmp/${id}-key.pem`, decryptedKey)
await fs.writeFile(`/tmp/${id}-cert.pem`, cert)
for (const destination of destinations) {
if (destination.remoteEngine) {
if (destination.remoteVerified) {
const { id: dockerId, remoteIpAddress } = destination
actions.push(async () => copyRemoteCertificates(id, dockerId, remoteIpAddress))
}
} else {
actions.push(async () => copyLocalCertificates(id))
}
}
}
await pAll.default(actions, { concurrency: 1 })
} catch (error) {
console.log(error)
} finally {
await asyncExecShell(`find /tmp/ -maxdepth 1 -type f -name '*-*.pem' -delete`)
}
}
async function copyRemoteCertificates(id: string, dockerId: string, remoteIpAddress: string) {
try {
await asyncExecShell(`scp /tmp/${id}-cert.pem /tmp/${id}-key.pem ${remoteIpAddress}:/tmp/`)
await executeSSHCmd({ dockerId, command: `docker exec coolify-proxy sh -c 'test -d /etc/traefik/acme/custom/ || mkdir -p /etc/traefik/acme/custom/'` })
await executeSSHCmd({ dockerId, command: `docker cp /tmp/${id}-key.pem coolify-proxy:/etc/traefik/acme/custom/` })
await executeSSHCmd({ dockerId, command: `docker cp /tmp/${id}-cert.pem coolify-proxy:/etc/traefik/acme/custom/` })
} catch (error) {
console.log({ error })
}
}
async function copyLocalCertificates(id: string) {
try {
await asyncExecShell(`docker exec coolify-proxy sh -c 'test -d /etc/traefik/acme/custom/ || mkdir -p /etc/traefik/acme/custom/'`)
await asyncExecShell(`docker cp /tmp/${id}-key.pem coolify-proxy:/etc/traefik/acme/custom/`)
await asyncExecShell(`docker cp /tmp/${id}-cert.pem coolify-proxy:/etc/traefik/acme/custom/`)
} catch (error) {
console.log({ error })
}
}
async function cleanupStorage() {
const destinationDockers = await prisma.destinationDocker.findMany();
let enginesDone = new Set()
for (const destination of destinationDockers) {
if (enginesDone.has(destination.engine) || enginesDone.has(destination.remoteIpAddress)) return
if (destination.engine) enginesDone.add(destination.engine)
if (destination.remoteIpAddress) enginesDone.add(destination.remoteIpAddress)
let lowDiskSpace = false;
try {
let stdout = null
if (!isDev) {
const output = await executeDockerCmd({ dockerId: destination.id, command: `CONTAINER=$(docker ps -lq | head -1) && docker exec $CONTAINER sh -c 'df -kPT /'` })
stdout = output.stdout;
} else {
const output = await asyncExecShell(
`df -kPT /`
);
stdout = output.stdout;
}
let lines = stdout.trim().split('\n');
let header = lines[0];
let regex =
/^Filesystem\s+|Type\s+|1024-blocks|\s+Used|\s+Available|\s+Capacity|\s+Mounted on\s*$/g;
const boundaries = [];
let match;
while ((match = regex.exec(header))) {
boundaries.push(match[0].length);
}
boundaries[boundaries.length - 1] = -1;
const data = lines.slice(1).map((line) => {
const cl = boundaries.map((boundary) => {
const column = boundary > 0 ? line.slice(0, boundary) : line;
line = line.slice(boundary);
return column.trim();
});
return {
capacity: Number.parseInt(cl[5], 10) / 100
};
});
if (data.length > 0) {
const { capacity } = data[0];
if (capacity > 0.8) {
lowDiskSpace = true;
}
}
} catch (error) { }
await cleanupDockerStorage(destination.id, lowDiskSpace, false)
}
}

View File

@@ -38,15 +38,22 @@ import * as buildpacks from '../lib/buildPacks';
for (const queueBuild of queuedBuilds) {
actions.push(async () => {
let application = await prisma.application.findUnique({ where: { id: queueBuild.applicationId }, include: { destinationDocker: true, gitSource: { include: { githubApp: true, gitlabApp: true } }, persistentStorage: true, secrets: true, settings: true, teams: true } })
let { id: buildId, type, sourceBranch = null, pullmergeRequestId = null, forceRebuild } = queueBuild
let { id: buildId, type, sourceBranch = null, pullmergeRequestId = null, previewApplicationId = null, forceRebuild, sourceRepository = null } = queueBuild
application = decryptApplication(application)
const originalApplicationId = application.id
if (pullmergeRequestId) {
const previewApplications = await prisma.previewApplication.findMany({ where: { applicationId: originalApplicationId, pullmergeRequestId } })
if (previewApplications.length > 0) {
previewApplicationId = previewApplications[0].id
}
}
const usableApplicationId = previewApplicationId || originalApplicationId
try {
if (queueBuild.status === 'running') {
await saveBuildLog({ line: 'Building halted, restarting...', buildId, applicationId: application.id });
}
const {
id: applicationId,
repository,
name,
destinationDocker,
destinationDockerId,
@@ -69,6 +76,7 @@ import * as buildpacks from '../lib/buildPacks';
} = application
let {
branch,
repository,
buildPack,
port,
installCommand,
@@ -77,6 +85,7 @@ import * as buildpacks from '../lib/buildPacks';
baseDirectory,
publishDirectory,
dockerFileLocation,
dockerComposeConfiguration,
denoMainFile
} = application
const currentHash = crypto
@@ -104,17 +113,6 @@ import * as buildpacks from '../lib/buildPacks';
)
.digest('hex');
const { debug } = settings;
if (concurrency === 1) {
await prisma.build.updateMany({
where: {
status: { in: ['queued', 'running'] },
id: { not: buildId },
applicationId,
createdAt: { lt: new Date(new Date().getTime() - 10 * 1000) }
},
data: { status: 'failed' }
});
}
let imageId = applicationId;
let domain = getDomain(fqdn);
const volumes =
@@ -127,8 +125,12 @@ import * as buildpacks from '../lib/buildPacks';
branch = sourceBranch;
domain = `${pullmergeRequestId}.${domain}`;
imageId = `${applicationId}-${pullmergeRequestId}`;
repository = sourceRepository || repository;
}
try {
dockerComposeConfiguration = JSON.parse(dockerComposeConfiguration)
} catch (error) { }
let deployNeeded = true;
let destinationType;
@@ -146,7 +148,7 @@ import * as buildpacks from '../lib/buildPacks';
startCommand = configuration.startCommand;
buildCommand = configuration.buildCommand;
publishDirectory = configuration.publishDirectory;
baseDirectory = configuration.baseDirectory;
baseDirectory = configuration.baseDirectory || '';
dockerFileLocation = configuration.dockerFileLocation;
denoMainFile = configuration.denoMainFile;
const commit = await importers[gitSource.type]({
@@ -203,18 +205,37 @@ import * as buildpacks from '../lib/buildPacks';
//
}
await copyBaseConfigurationFiles(buildPack, workdir, buildId, applicationId, baseImage);
const labels = makeLabelForStandaloneApplication({
applicationId,
fqdn,
name,
type,
pullmergeRequestId,
buildPack,
repository,
branch,
projectId,
port: exposePort ? `${exposePort}:${port}` : port,
commit,
installCommand,
buildCommand,
startCommand,
baseDirectory,
publishDirectory
});
if (forceRebuild) deployNeeded = true
if (!imageFound || deployNeeded) {
// if (true) {
if (buildpacks[buildPack])
await buildpacks[buildPack]({
dockerId: destinationDocker.id,
network: destinationDocker.network,
buildId,
applicationId,
domain,
name,
type,
volumes,
labels,
pullmergeRequestId,
buildPack,
repository,
@@ -236,11 +257,12 @@ import * as buildpacks from '../lib/buildPacks';
pythonModule,
pythonVariable,
dockerFileLocation,
dockerComposeConfiguration,
denoMainFile,
denoOptions,
baseImage,
baseBuildImage,
deploymentType
deploymentType,
});
else {
await saveBuildLog({ line: `Build pack ${buildPack} not found`, buildId, applicationId });
@@ -249,112 +271,152 @@ import * as buildpacks from '../lib/buildPacks';
} else {
await saveBuildLog({ line: 'Build image already available - no rebuild required.', buildId, applicationId });
}
try {
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker stop -t 0 ${imageId}` })
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker rm ${imageId}` })
} catch (error) {
//
}
const envs = [
`PORT=${port}`
];
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
if (!secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
if (buildPack === 'compose') {
try {
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker ps -a --filter 'label=coolify.applicationId=${applicationId}' --format {{.ID}}|xargs -r -n 1 docker stop -t 0`
})
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker ps -a --filter 'label=coolify.applicationId=${applicationId}' --format {{.ID}}|xargs -r -n 1 docker rm --force`
})
} catch (error) {
//
}
try {
await executeDockerCmd({ debug, buildId, applicationId, dockerId: destinationDocker.id, command: `docker compose --project-directory ${workdir} up -d` })
await saveBuildLog({ line: 'Deployment successful!', buildId, applicationId });
await saveBuildLog({ line: 'Proxy will be updated shortly.', buildId, applicationId });
await prisma.build.update({ where: { id: buildId }, data: { status: 'success' } });
await prisma.application.update({
where: { id: applicationId },
data: { configHash: currentHash }
});
} catch (error) {
await saveBuildLog({ line: error, buildId, applicationId });
const foundBuild = await prisma.build.findUnique({ where: { id: buildId } })
if (foundBuild) {
await prisma.build.update({
where: { id: buildId },
data: {
status: 'failed'
}
});
}
});
}
await fs.writeFile(`${workdir}/.env`, envs.join('\n'));
const labels = makeLabelForStandaloneApplication({
applicationId,
fqdn,
name,
type,
pullmergeRequestId,
buildPack,
repository,
branch,
projectId,
port: exposePort ? `${exposePort}:${port}` : port,
commit,
installCommand,
buildCommand,
startCommand,
baseDirectory,
publishDirectory
});
let envFound = false;
try {
envFound = !!(await fs.stat(`${workdir}/.env`));
} catch (error) {
//
}
try {
await saveBuildLog({ line: 'Deployment started.', buildId, applicationId });
const composeVolumes = volumes.map((volume) => {
return {
[`${volume.split(':')[0]}`]: {
name: volume.split(':')[0]
throw new Error(error);
}
} else {
try {
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker ps -a --filter 'label=com.docker.compose.service=${pullmergeRequestId ? imageId : applicationId}' --format {{.ID}}|xargs -r -n 1 docker stop -t 0`
})
await executeDockerCmd({
dockerId: destinationDockerId,
command: `docker ps -a --filter 'label=com.docker.compose.service=${pullmergeRequestId ? imageId : applicationId}' --format {{.ID}}|xargs -r -n 1 docker rm --force`
})
} catch (error) {
//
}
const envs = [
`PORT=${port}`
];
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
envs.push(`${secret.name}=${isSecretFound[0].value}`);
} else {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
if (!secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
}
});
}
await fs.writeFile(`${workdir}/.env`, envs.join('\n'));
let envFound = false;
try {
envFound = !!(await fs.stat(`${workdir}/.env`));
} catch (error) {
//
}
try {
await saveBuildLog({ line: 'Deployment started.', buildId, applicationId });
const composeVolumes = volumes.map((volume) => {
return {
[`${volume.split(':')[0]}`]: {
name: volume.split(':')[0]
}
};
});
const composeFile = {
version: '3.8',
services: {
[imageId]: {
image: `${applicationId}:${tag}`,
container_name: imageId,
volumes,
env_file: envFound ? [`${workdir}/.env`] : [],
labels,
depends_on: [],
expose: [port],
...(exposePort ? { ports: [`${exposePort}:${port}`] } : {}),
...defaultComposeConfiguration(destinationDocker.network),
}
},
networks: {
[destinationDocker.network]: {
external: true
}
},
volumes: Object.assign({}, ...composeVolumes)
};
await fs.writeFile(`${workdir}/docker-compose.yml`, yaml.dump(composeFile));
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker compose --project-directory ${workdir} up -d` })
await saveBuildLog({ line: 'Deployment successful!', buildId, applicationId });
} catch (error) {
await saveBuildLog({ line: error, buildId, applicationId });
const foundBuild = await prisma.build.findUnique({ where: { id: buildId } })
if (foundBuild) {
await prisma.build.update({
where: { id: buildId },
data: {
status: 'failed'
}
});
}
throw new Error(error);
}
await saveBuildLog({ line: 'Proxy will be updated shortly.', buildId, applicationId });
await prisma.build.update({ where: { id: buildId }, data: { status: 'success' } });
if (!pullmergeRequestId) await prisma.application.update({
where: { id: applicationId },
data: { configHash: currentHash }
});
const composeFile = {
version: '3.8',
services: {
[imageId]: {
image: `${applicationId}:${tag}`,
container_name: imageId,
volumes,
env_file: envFound ? [`${workdir}/.env`] : [],
labels,
depends_on: [],
expose: [port],
...(exposePort ? { ports: [`${exposePort}:${port}`] } : {}),
// logging: {
// driver: 'fluentd',
// },
...defaultComposeConfiguration(destinationDocker.network),
}
},
networks: {
[destinationDocker.network]: {
external: true
}
},
volumes: Object.assign({}, ...composeVolumes)
};
await fs.writeFile(`${workdir}/docker-compose.yml`, yaml.dump(composeFile));
await executeDockerCmd({ dockerId: destinationDocker.id, command: `docker compose --project-directory ${workdir} up -d` })
await saveBuildLog({ line: 'Deployment successful!', buildId, applicationId });
} catch (error) {
await saveBuildLog({ line: error, buildId, applicationId });
await prisma.build.updateMany({
where: { id: buildId, status: { in: ['queued', 'running'] } },
data: { status: 'failed' }
});
throw new Error(error);
}
await saveBuildLog({ line: 'Proxy will be updated shortly.', buildId, applicationId });
await prisma.build.update({ where: { id: buildId }, data: { status: 'success' } });
if (!pullmergeRequestId) await prisma.application.update({
where: { id: applicationId },
data: { configHash: currentHash }
});
}
}
catch (error) {
await prisma.build.updateMany({
where: { id: buildId, status: { in: ['queued', 'running'] } },
data: { status: 'failed' }
});
await saveBuildLog({ line: error, buildId, applicationId: application.id });
const foundBuild = await prisma.build.findUnique({ where: { id: buildId } })
if (foundBuild) {
await prisma.build.update({
where: { id: buildId },
data: {
status: 'failed'
}
});
}
if (error !== 1) {
await saveBuildLog({ line: error, buildId, applicationId: application.id });
}
}
});
}

View File

@@ -1,229 +0,0 @@
import { parentPort } from 'node:worker_threads';
import axios from 'axios';
import { compareVersions } from 'compare-versions';
import { asyncExecShell, cleanupDockerStorage, executeDockerCmd, isDev, prisma, startTraefikTCPProxy, generateDatabaseConfiguration, startTraefikProxy, listSettings, version, createRemoteEngineConfiguration } from '../lib/common';
async function autoUpdater() {
try {
const currentVersion = version;
const { data: versions } = await axios
.get(
`https://get.coollabs.io/versions.json`
, {
params: {
appId: process.env['COOLIFY_APP_ID'] || undefined,
version: currentVersion
}
})
const latestVersion = versions['coolify'].main.version;
const isUpdateAvailable = compareVersions(latestVersion, currentVersion);
if (isUpdateAvailable === 1) {
const activeCount = 0
if (activeCount === 0) {
if (!isDev) {
const { isAutoUpdateEnabled } = await prisma.setting.findFirst();
if (isAutoUpdateEnabled) {
await asyncExecShell(`docker pull coollabsio/coolify:${latestVersion}`);
await asyncExecShell(`env | grep COOLIFY > .env`);
await asyncExecShell(
`sed -i '/COOLIFY_AUTO_UPDATE=/cCOOLIFY_AUTO_UPDATE=${isAutoUpdateEnabled}' .env`
);
await asyncExecShell(
`docker run --rm -tid --env-file .env -v /var/run/docker.sock:/var/run/docker.sock -v coolify-db coollabsio/coolify:${latestVersion} /bin/sh -c "env | grep COOLIFY > .env && echo 'TAG=${latestVersion}' >> .env && docker stop -t 0 coolify && docker rm coolify && docker compose up -d --force-recreate"`
);
}
} else {
console.log('Updating (not really in dev mode).');
}
}
}
} catch (error) { }
}
async function checkProxies() {
try {
const { default: isReachable } = await import('is-port-reachable');
let portReachable;
const { arch, ipv4, ipv6 } = await listSettings();
// Coolify Proxy local
const engine = '/var/run/docker.sock';
const localDocker = await prisma.destinationDocker.findFirst({
where: { engine, network: 'coolify', isCoolifyProxyUsed: true }
});
if (localDocker) {
portReachable = await isReachable(80, { host: ipv4 || ipv6 })
if (!portReachable) {
await startTraefikProxy(localDocker.id);
}
}
// Coolify Proxy remote
const remoteDocker = await prisma.destinationDocker.findMany({
where: { remoteEngine: true, remoteVerified: true }
});
if (remoteDocker.length > 0) {
for (const docker of remoteDocker) {
if (docker.isCoolifyProxyUsed) {
portReachable = await isReachable(80, { host: docker.remoteIpAddress })
if (!portReachable) {
await startTraefikProxy(docker.id);
}
}
try {
await createRemoteEngineConfiguration(docker.id)
} catch (error) { }
}
}
// TCP Proxies
const databasesWithPublicPort = await prisma.database.findMany({
where: { publicPort: { not: null } },
include: { settings: true, destinationDocker: true }
});
for (const database of databasesWithPublicPort) {
const { destinationDockerId, destinationDocker, publicPort, id } = database;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
const { privatePort } = generateDatabaseConfiguration(database, arch);
portReachable = await isReachable(publicPort, { host: destinationDocker.remoteIpAddress || ipv4 || ipv6 })
if (!portReachable) {
await startTraefikTCPProxy(destinationDocker, id, publicPort, privatePort);
}
}
}
const wordpressWithFtp = await prisma.wordpress.findMany({
where: { ftpPublicPort: { not: null } },
include: { service: { include: { destinationDocker: true } } }
});
for (const ftp of wordpressWithFtp) {
const { service, ftpPublicPort } = ftp;
const { destinationDockerId, destinationDocker, id } = service;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
portReachable = await isReachable(ftpPublicPort, { host: destinationDocker.remoteIpAddress || ipv4 || ipv6 })
if (!portReachable) {
await startTraefikTCPProxy(destinationDocker, id, ftpPublicPort, 22, 'wordpressftp');
}
}
}
// HTTP Proxies
const minioInstances = await prisma.minio.findMany({
where: { publicPort: { not: null } },
include: { service: { include: { destinationDocker: true } } }
});
for (const minio of minioInstances) {
const { service, publicPort } = minio;
const { destinationDockerId, destinationDocker, id } = service;
if (destinationDockerId && destinationDocker.isCoolifyProxyUsed) {
portReachable = await isReachable(publicPort, { host: destinationDocker.remoteIpAddress || ipv4 || ipv6 })
if (!portReachable) {
await startTraefikTCPProxy(destinationDocker, id, publicPort, 9000);
}
}
}
} catch (error) {
}
}
async function cleanupPrismaEngines() {
if (!isDev) {
try {
const { stdout } = await asyncExecShell(`ps -ef | grep /app/prisma-engines/query-engine | grep -v grep | wc -l | xargs`)
if (stdout.trim() != null && stdout.trim() != '' && Number(stdout.trim()) > 1) {
await asyncExecShell(`killall -q -e /app/prisma-engines/query-engine -o 1m`)
}
} catch (error) { }
}
}
async function cleanupStorage() {
const destinationDockers = await prisma.destinationDocker.findMany();
let enginesDone = new Set()
for (const destination of destinationDockers) {
if (enginesDone.has(destination.engine) || enginesDone.has(destination.remoteIpAddress)) return
if (destination.engine) enginesDone.add(destination.engine)
if (destination.remoteIpAddress) enginesDone.add(destination.remoteIpAddress)
let lowDiskSpace = false;
try {
let stdout = null
if (!isDev) {
const output = await executeDockerCmd({ dockerId: destination.id, command: `CONTAINER=$(docker ps -lq | head -1) && docker exec $CONTAINER sh -c 'df -kPT /'` })
stdout = output.stdout;
} else {
const output = await asyncExecShell(
`df -kPT /`
);
stdout = output.stdout;
}
let lines = stdout.trim().split('\n');
let header = lines[0];
let regex =
/^Filesystem\s+|Type\s+|1024-blocks|\s+Used|\s+Available|\s+Capacity|\s+Mounted on\s*$/g;
const boundaries = [];
let match;
while ((match = regex.exec(header))) {
boundaries.push(match[0].length);
}
boundaries[boundaries.length - 1] = -1;
const data = lines.slice(1).map((line) => {
const cl = boundaries.map((boundary) => {
const column = boundary > 0 ? line.slice(0, boundary) : line;
line = line.slice(boundary);
return column.trim();
});
return {
capacity: Number.parseInt(cl[5], 10) / 100
};
});
if (data.length > 0) {
const { capacity } = data[0];
if (capacity > 0.8) {
lowDiskSpace = true;
}
}
} catch (error) { }
await cleanupDockerStorage(destination.id, lowDiskSpace, false)
}
}
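The disk-space check above locates the `df -kPT /` header columns with a regex, records each column's width, and slices every data row at those offsets. A minimal sketch of the same parsing (not from the diff; the sample output is fabricated with padEnd/padStart so the columns line up):

// Sketch: column-boundary parsing of a made-up `df -kPT /` output,
// mirroring the logic in cleanupStorage(). All values are hypothetical.
const header =
    'Filesystem'.padEnd(15) + 'Type'.padEnd(5) + '1024-blocks' +
    'Used'.padStart(9) + 'Available'.padStart(10) + 'Capacity'.padStart(9) + ' Mounted on';
const row =
    '/dev/sda1'.padEnd(15) + 'ext4'.padEnd(5) + '41152736'.padStart(11) +
    '34579020'.padStart(9) + '4460332'.padStart(10) + '89%'.padStart(9) + ' /';

const regex =
    /^Filesystem\s+|Type\s+|1024-blocks|\s+Used|\s+Available|\s+Capacity|\s+Mounted on\s*$/g;
const boundaries: number[] = [];
let match: RegExpExecArray | null;
while ((match = regex.exec(header))) {
    boundaries.push(match[0].length); // width of each header column
}
boundaries[boundaries.length - 1] = -1; // last column takes the rest of the line

let rest = row;
const columns = boundaries.map((boundary) => {
    const column = boundary > 0 ? rest.slice(0, boundary) : rest;
    rest = rest.slice(boundary);
    return column.trim();
});
const capacity = Number.parseInt(columns[5], 10) / 100; // Capacity column: '89%' -> 0.89
console.log(capacity > 0.8); // true -> lowDiskSpace would be set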
(async () => {
let status = {
cleanupStorage: false,
autoUpdater: false
}
if (parentPort) {
parentPort.on('message', async (message) => {
if (parentPort) {
if (message === 'error') throw new Error('oops');
if (message === 'cancel') {
parentPort.postMessage('cancelled');
process.exit(1);
}
if (message === 'action:cleanupStorage') {
if (!status.autoUpdater) {
status.cleanupStorage = true
await cleanupStorage();
status.cleanupStorage = false
}
return;
}
if (message === 'action:cleanupPrismaEngines') {
await cleanupPrismaEngines();
return;
}
if (message === 'action:checkProxies') {
await checkProxies();
return;
}
if (message === 'action:autoUpdater') {
if (!status.cleanupStorage) {
status.autoUpdater = true
await autoUpdater();
status.autoUpdater = false
}
return;
}
}
});
} else process.exit(0);
})();
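For context, this worker only reacts to string messages posted by the Bree scheduler on the main thread. A hedged sketch of what a trigger could look like on the parent side (the `scheduler` instance and the 'infrastructure' job name mirror the Bree options shown further below; this is an assumption, not code from the diff):

// Sketch only: how the main process could ask this worker to run one of its actions.
// Assumes `scheduler` is the Bree instance that registered this file as the 'infrastructure' job.
async function triggerCleanupStorage(scheduler: any) {
    if (!scheduler.workers.has('infrastructure')) {
        await scheduler.run('infrastructure'); // spin the worker up if it is not running
    }
    scheduler.workers.get('infrastructure')?.postMessage('action:cleanupStorage');
}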

apps/api/src/lib.ts (new file, 486 lines added)
View File

@@ -0,0 +1,486 @@
import cuid from "cuid";
import { decrypt, encrypt, fixType, generatePassword, getDomain, prisma } from "./lib/common";
import { getTemplates } from "./lib/services";
export async function migrateServicesToNewTemplate() {
// This function migrates old hardcoded services to the new template-based services
try {
let templates = await getTemplates()
const services: any = await prisma.service.findMany({
include: {
destinationDocker: true,
persistentStorage: true,
serviceSecret: true,
serviceSetting: true,
minio: true,
plausibleAnalytics: true,
vscodeserver: true,
wordpress: true,
ghost: true,
meiliSearch: true,
umami: true,
hasura: true,
fider: true,
moodle: true,
appwrite: true,
glitchTip: true,
searxng: true,
weblate: true,
taiga: true,
}
})
for (const service of services) {
const { id } = service
if (!service.type) {
continue;
}
let template = templates.find(t => fixType(t.type) === fixType(service.type));
if (template) {
template = JSON.parse(JSON.stringify(template).replaceAll('$$id', service.id))
if (service.type === 'plausibleanalytics' && service.plausibleAnalytics) await plausibleAnalytics(service, template)
if (service.type === 'fider' && service.fider) await fider(service, template)
if (service.type === 'minio' && service.minio) await minio(service, template)
if (service.type === 'vscodeserver' && service.vscodeserver) await vscodeserver(service, template)
if (service.type === 'wordpress' && service.wordpress) await wordpress(service, template)
if (service.type === 'ghost' && service.ghost) await ghost(service, template)
if (service.type === 'meilisearch' && service.meiliSearch) await meilisearch(service, template)
if (service.type === 'umami' && service.umami) await umami(service, template)
if (service.type === 'hasura' && service.hasura) await hasura(service, template)
if (service.type === 'glitchTip' && service.glitchTip) await glitchtip(service, template)
if (service.type === 'searxng' && service.searxng) await searxng(service, template)
if (service.type === 'weblate' && service.weblate) await weblate(service, template)
if (service.type === 'appwrite' && service.appwrite) await appwrite(service, template)
await createVolumes(service, template);
if (template.variables.length > 0) {
for (const variable of template.variables) {
const { defaultValue } = variable;
const regex = /^\$\$.*\((\d+)\)$/g;
const length = Number(regex.exec(defaultValue)?.[1]) || undefined
if (variable.defaultValue.startsWith('$$generate_password')) {
variable.value = generatePassword({ length });
} else if (variable.defaultValue.startsWith('$$generate_hex')) {
variable.value = generatePassword({ length, isHex: true });
} else if (variable.defaultValue.startsWith('$$generate_username')) {
variable.value = cuid();
} else {
variable.value = variable.defaultValue || '';
}
}
}
for (const variable of template.variables) {
if (variable.id.startsWith('$$secret_')) {
const found = await prisma.serviceSecret.findFirst({ where: { name: variable.name, serviceId: id } })
if (!found) {
await prisma.serviceSecret.create({
data: { name: variable.name, value: encrypt(variable.value) || '', service: { connect: { id } } }
})
}
}
if (variable.id.startsWith('$$config_')) {
const found = await prisma.serviceSetting.findFirst({ where: { name: variable.name, serviceId: id } })
if (!found) {
await prisma.serviceSetting.create({
data: { name: variable.name, value: variable.value.toString(), variableName: variable.id, service: { connect: { id } } }
})
}
}
}
for (const s of Object.keys(template.services)) {
if (service.type === 'plausibleanalytics') {
continue;
}
if (template.services[s].volumes) {
for (const volume of template.services[s].volumes) {
const [volumeName, path] = volume.split(':')
if (!volumeName.startsWith('/')) {
const found = await prisma.servicePersistentStorage.findFirst({ where: { volumeName, serviceId: id } })
if (!found) {
await prisma.servicePersistentStorage.create({
data: { volumeName, path, containerId: s, predefined: true, service: { connect: { id } } }
});
}
}
}
}
}
await prisma.service.update({ where: { id }, data: { templateVersion: template.templateVersion } })
}
}
} catch (error) {
console.log(error)
}
}
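The variable-default handling above relies on a small convention: defaults such as `$$generate_password(32)` carry an optional length in parentheses, extracted by the regex before the matching generator runs. A minimal sketch of that resolution (not from the diff; the defaults listed are made-up examples, and the imports match the ones at the top of this file):

// Sketch: how $$generate_* defaultValues resolve to concrete values.
import cuid from "cuid";
import { generatePassword } from "./lib/common";

for (const defaultValue of ['$$generate_password(32)', '$$generate_hex(16)', '$$generate_username', 'weblate']) {
    const regex = /^\$\$.*\((\d+)\)$/g;
    const length = Number(regex.exec(defaultValue)?.[1]) || undefined;
    if (defaultValue.startsWith('$$generate_password')) {
        console.log(generatePassword({ length }));               // e.g. a 32-character password
    } else if (defaultValue.startsWith('$$generate_hex')) {
        console.log(generatePassword({ length, isHex: true }));  // e.g. a 16-character hex string
    } else if (defaultValue.startsWith('$$generate_username')) {
        console.log(cuid());                                      // random collision-resistant id
    } else {
        console.log(defaultValue);                                // literal default, e.g. 'weblate'
    }
}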
async function appwrite(service: any, template: any) {
const { opensslKeyV1, executorSecret, mariadbHost, mariadbPort, mariadbUser, mariadbPassword, mariadbRootUserPassword, mariadbDatabase } = service.appwrite
const secrets = [
`_APP_EXECUTOR_SECRET@@@${executorSecret}`,
`_APP_OPENSSL_KEY_V1@@@${opensslKeyV1}`,
`_APP_DB_PASS@@@${mariadbPassword}`,
`_APP_DB_ROOT_PASS@@@${mariadbRootUserPassword}`,
]
const settings = [
`_APP_DB_HOST@@@${mariadbHost}`,
`_APP_DB_PORT@@@${mariadbPort}`,
`_APP_DB_USER@@@${mariadbUser}`,
`_APP_DB_SCHEMA@@@${mariadbDatabase}`,
]
await migrateSecrets(secrets, service);
await migrateSettings(settings, service, template);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { appwrite: { disconnect: true } } })
}
async function weblate(service: any, template: any) {
const { adminPassword, postgresqlUser, postgresqlPassword, postgresqlDatabase } = service.weblate
const secrets = [
`WEBLATE_ADMIN_PASSWORD@@@${adminPassword}`,
`POSTGRES_PASSWORD@@@${postgresqlPassword}`,
]
const settings = [
`WEBLATE_SITE_DOMAIN@@@$$generate_domain`,
`POSTGRES_USER@@@${postgresqlUser}`,
`POSTGRES_DATABASE@@@${postgresqlDatabase}`,
`POSTGRES_DB@@@${postgresqlDatabase}`,
`POSTGRES_HOST@@@$$id-postgres`,
`POSTGRES_PORT@@@5432`,
`REDIS_HOST@@@$$id-redis`,
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { weblate: { disconnect: true } } })
}
async function searxng(service: any, template: any) {
const { secretKey, redisPassword } = service.searxng
const secrets = [
`SECRET_KEY@@@${secretKey}`,
`REDIS_PASSWORD@@@${redisPassword}`,
]
const settings = [
`SEARXNG_BASE_URL@@@$$generate_fqdn`
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { searxng: { disconnect: true } } })
}
async function glitchtip(service: any, template: any) {
const { postgresqlUser, postgresqlPassword, postgresqlDatabase, secretKeyBase, defaultEmail, defaultUsername, defaultPassword, defaultEmailFrom, emailSmtpHost, emailSmtpPort, emailSmtpUser, emailSmtpPassword, emailSmtpUseTls, emailSmtpUseSsl, emailBackend, mailgunApiKey, sendgridApiKey, enableOpenUserRegistration } = service.glitchTip
const { id } = service
const secrets = [
`POSTGRES_PASSWORD@@@${postgresqlPassword}`,
`SECRET_KEY@@@${secretKeyBase}`,
`MAILGUN_API_KEY@@@${mailgunApiKey}`,
`SENDGRID_API_KEY@@@${sendgridApiKey}`,
`DJANGO_SUPERUSER_PASSWORD@@@${defaultPassword}`,
emailSmtpUser && emailSmtpPassword && emailSmtpHost && emailSmtpPort && `EMAIL_URL@@@${encrypt(`smtp://${emailSmtpUser}:${decrypt(emailSmtpPassword)}@${emailSmtpHost}:${emailSmtpPort}`)}`,
`DATABASE_URL@@@${encrypt(`postgres://${postgresqlUser}:${decrypt(postgresqlPassword)}@${id}-postgresql:5432/${postgresqlDatabase}`)}`,
`REDIS_URL@@@${encrypt(`redis://${id}-redis:6379`)}`
]
const settings = [
`POSTGRES_USER@@@${postgresqlUser}`,
`POSTGRES_DB@@@${postgresqlDatabase}`,
`DEFAULT_FROM_EMAIL@@@${defaultEmailFrom}`,
`EMAIL_USE_TLS@@@${emailSmtpUseTls}`,
`EMAIL_USE_SSL@@@${emailSmtpUseSsl}`,
`EMAIL_BACKEND@@@${emailBackend}`,
`ENABLE_OPEN_USER_REGISTRATION@@@${enableOpenUserRegistration}`,
`DJANGO_SUPERUSER_EMAIL@@@${defaultEmail}`,
`DJANGO_SUPERUSER_USERNAME@@@${defaultUsername}`,
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
await prisma.service.update({ where: { id: service.id }, data: { type: 'glitchtip' } })
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { glitchTip: { disconnect: true } } })
}
async function hasura(service: any, template: any) {
const { postgresqlUser, postgresqlPassword, postgresqlDatabase, graphQLAdminPassword } = service.hasura
const { id } = service
const secrets = [
`HASURA_GRAPHQL_ADMIN_PASSWORD@@@${graphQLAdminPassword}`,
`HASURA_GRAPHQL_METADATA_DATABASE_URL@@@${encrypt(`postgres://${postgresqlUser}:${decrypt(postgresqlPassword)}@${id}-postgresql:5432/${postgresqlDatabase}`)}`,
`POSTGRES_PASSWORD@@@${postgresqlPassword}`,
]
const settings = [
`POSTGRES_USER@@@${postgresqlUser}`,
`POSTGRES_DB@@@${postgresqlDatabase}`,
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { hasura: { disconnect: true } } })
}
async function umami(service: any, template: any) {
const { postgresqlUser, postgresqlPassword, postgresqlDatabase, umamiAdminPassword, hashSalt } = service.umami
const { id } = service
const secrets = [
`HASH_SALT@@@${hashSalt}`,
`POSTGRES_PASSWORD@@@${postgresqlPassword}`,
`ADMIN_PASSWORD@@@${umamiAdminPassword}`,
`DATABASE_URL@@@${encrypt(`postgres://${postgresqlUser}:${decrypt(postgresqlPassword)}@${id}-postgresql:5432/${postgresqlDatabase}`)}`,
]
const settings = [
`DATABASE_TYPE@@@postgresql`,
`POSTGRES_USER@@@${postgresqlUser}`,
`POSTGRES_DB@@@${postgresqlDatabase}`,
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { umami: { disconnect: true } } })
}
async function meilisearch(service: any, template: any) {
const { masterKey } = service.meiliSearch
const secrets = [
`MEILI_MASTER_KEY@@@${masterKey}`,
]
// await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { meiliSearch: { disconnect: true } } })
}
async function ghost(service: any, template: any) {
const { defaultEmail, defaultPassword, mariadbUser, mariadbPassword, mariadbRootUser, mariadbRootUserPassword, mariadbDatabase } = service.ghost
const { fqdn } = service
const isHttps = fqdn.startsWith('https://');
const secrets = [
`GHOST_PASSWORD@@@${defaultPassword}`,
`MARIADB_PASSWORD@@@${mariadbPassword}`,
`MARIADB_ROOT_PASSWORD@@@${mariadbRootUserPassword}`,
`GHOST_DATABASE_PASSWORD@@@${mariadbPassword}`,
]
const settings = [
`GHOST_EMAIL@@@${defaultEmail}`,
`GHOST_DATABASE_HOST@@@${service.id}-mariadb`,
`GHOST_DATABASE_USER@@@${mariadbUser}`,
`GHOST_DATABASE_NAME@@@${mariadbDatabase}`,
`GHOST_DATABASE_PORT_NUMBER@@@3306`,
`MARIADB_USER@@@${mariadbUser}`,
`MARIADB_DATABASE@@@${mariadbDatabase}`,
`MARIADB_ROOT_USER@@@${mariadbRootUser}`,
`GHOST_HOST@@@$$generate_domain`,
`url@@@$$generate_fqdn`,
`GHOST_ENABLE_HTTPS@@@${isHttps ? 'yes' : 'no'}`
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
await prisma.service.update({ where: { id: service.id }, data: { type: "ghost-mariadb" } })
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { ghost: { disconnect: true } } })
}
async function wordpress(service: any, template: any) {
const { extraConfig, tablePrefix, ownMysql, mysqlHost, mysqlPort, mysqlUser, mysqlPassword, mysqlRootUser, mysqlRootUserPassword, mysqlDatabase, ftpEnabled, ftpUser, ftpPassword, ftpPublicPort, ftpHostKey, ftpHostKeyPrivate } = service.wordpress
let settings = []
let secrets = []
if (ownMysql) {
secrets = [
`WORDPRESS_DB_PASSWORD@@@${mysqlPassword}`,
ftpPassword && `COOLIFY_FTP_PASSWORD@@@${ftpPassword}`,
ftpHostKeyPrivate && `COOLIFY_FTP_HOST_KEY_PRIVATE@@@${ftpHostKeyPrivate}`,
ftpHostKey && `COOLIFY_FTP_HOST_KEY@@@${ftpHostKey}`,
]
settings = [
`WORDPRESS_CONFIG_EXTRA@@@${extraConfig}`,
`WORDPRESS_DB_HOST@@@${mysqlHost}`,
`WORDPRESS_DB_PORT@@@${mysqlPort}`,
`WORDPRESS_DB_USER@@@${mysqlUser}`,
`WORDPRESS_DB_NAME@@@${mysqlDatabase}`,
]
} else {
secrets = [
`MYSQL_ROOT_PASSWORD@@@${mysqlRootUserPassword}`,
`MYSQL_PASSWORD@@@${mysqlPassword}`,
ftpPassword && `COOLIFY_FTP_PASSWORD@@@${ftpPassword}`,
ftpHostKeyPrivate && `COOLIFY_FTP_HOST_KEY_PRIVATE@@@${ftpHostKeyPrivate}`,
ftpHostKey && `COOLIFY_FTP_HOST_KEY@@@${ftpHostKey}`,
]
settings = [
`MYSQL_ROOT_USER@@@${mysqlRootUser}`,
`MYSQL_USER@@@${mysqlUser}`,
`MYSQL_DATABASE@@@${mysqlDatabase}`,
`MYSQL_HOST@@@${service.id}-mysql`,
`MYSQL_PORT@@@${mysqlPort}`,
`WORDPRESS_CONFIG_EXTRA@@@${extraConfig}`,
`WORDPRESS_TABLE_PREFIX@@@${tablePrefix}`,
`WORDPRESS_DB_HOST@@@${service.id}-mysql`,
`COOLIFY_OWN_DB@@@${ownMysql}`,
`COOLIFY_FTP_ENABLED@@@${ftpEnabled}`,
`COOLIFY_FTP_USER@@@${ftpUser}`,
`COOLIFY_FTP_PUBLIC_PORT@@@${ftpPublicPort}`,
]
}
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
if (ownMysql) {
await prisma.service.update({ where: { id: service.id }, data: { type: "wordpress-only" } })
}
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { wordpress: { disconnect: true } } })
}
async function vscodeserver(service: any, template: any) {
const { password } = service.vscodeserver
const secrets = [
`PASSWORD@@@${password}`,
]
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { vscodeserver: { disconnect: true } } })
}
async function minio(service: any, template: any) {
const { rootUser, rootUserPassword, apiFqdn } = service.minio
const secrets = [
`MINIO_ROOT_PASSWORD@@@${rootUserPassword}`,
]
const settings = [
`MINIO_ROOT_USER@@@${rootUser}`,
`MINIO_SERVER_URL@@@${apiFqdn}`,
`MINIO_BROWSER_REDIRECT_URL@@@$$generate_fqdn`,
`MINIO_DOMAIN@@@$$generate_domain`,
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { minio: { disconnect: true } } })
}
async function fider(service: any, template: any) {
const { postgresqlUser, postgresqlPassword, postgresqlDatabase, jwtSecret, emailNoreply, emailMailgunApiKey, emailMailgunDomain, emailMailgunRegion, emailSmtpHost, emailSmtpPort, emailSmtpUser, emailSmtpPassword, emailSmtpEnableStartTls } = service.fider
const { id } = service
const secrets = [
`JWT_SECRET@@@${jwtSecret}`,
emailMailgunApiKey && `EMAIL_MAILGUN_API@@@${emailMailgunApiKey}`,
emailSmtpPassword && `EMAIL_SMTP_PASSWORD@@@${emailSmtpPassword}`,
`POSTGRES_PASSWORD@@@${postgresqlPassword}`,
`DATABASE_URL@@@${encrypt(`postgresql://${postgresqlUser}:${decrypt(postgresqlPassword)}@${id}-postgresql:5432/${postgresqlDatabase}?sslmode=disable`)}`
]
const settings = [
`BASE_URL@@@$$generate_fqdn`,
`EMAIL_NOREPLY@@@${emailNoreply || 'noreply@example.com'}`,
`EMAIL_MAILGUN_DOMAIN@@@${emailMailgunDomain || ''}`,
`EMAIL_MAILGUN_REGION@@@${emailMailgunRegion || ''}`,
`EMAIL_SMTP_HOST@@@${emailSmtpHost || ''}`,
`EMAIL_SMTP_PORT@@@${emailSmtpPort || 587}`,
`EMAIL_SMTP_USER@@@${emailSmtpUser || ''}`,
`EMAIL_SMTP_PASSWORD@@@${emailSmtpPassword || ''}`,
`EMAIL_SMTP_ENABLE_STARTTLS@@@${emailSmtpEnableStartTls || 'false'}`,
`POSTGRES_USER@@@${postgresqlUser}`,
`POSTGRES_DB@@@${postgresqlDatabase}`,
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { fider: { disconnect: true } } })
}
async function plausibleAnalytics(service: any, template: any) {
const { email, username, password, postgresqlUser, postgresqlPassword, postgresqlDatabase, secretKeyBase, scriptName } = service.plausibleAnalytics;
const { id } = service
const settings = [
`BASE_URL@@@$$generate_fqdn`,
`ADMIN_USER_EMAIL@@@${email}`,
`ADMIN_USER_NAME@@@${username}`,
`DISABLE_AUTH@@@false`,
`DISABLE_REGISTRATION@@@true`,
`POSTGRESQL_USERNAME@@@${postgresqlUser}`,
`POSTGRESQL_DATABASE@@@${postgresqlDatabase}`,
`SCRIPT_NAME@@@${scriptName}`,
]
const secrets = [
`ADMIN_USER_PWD@@@${password}`,
`SECRET_KEY_BASE@@@${secretKeyBase}`,
`POSTGRESQL_PASSWORD@@@${postgresqlPassword}`,
`DATABASE_URL@@@${encrypt(`postgres://${postgresqlUser}:${decrypt(postgresqlPassword)}@${id}-postgresql:5432/${postgresqlDatabase}`)}`,
]
await migrateSettings(settings, service, template);
await migrateSecrets(secrets, service);
// Disconnect old service data
// await prisma.service.update({ where: { id: service.id }, data: { plausibleAnalytics: { disconnect: true } } })
}
async function migrateSettings(settings: any[], service: any, template: any) {
for (const setting of settings) {
if (!setting) continue;
let [name, value] = setting.split('@@@')
let originalName = name
if (name === 'MINIO_SERVER_URL') {
name = 'coolify_fqdn_minio_console'
}
if (!value || value === 'null') {
continue;
}
let variableName = template.variables.find((v: any) => v.name === name)?.id
if (!variableName) {
variableName = `$$config_${name.toLowerCase()}`
}
// console.log('Migrating setting', name, value, 'for service', service.id, ', service name:', service.name, 'variableName: ', variableName)
await prisma.serviceSetting.findFirst({ where: { name: originalName, serviceId: service.id } }) || await prisma.serviceSetting.create({ data: { name: originalName, value, variableName, service: { connect: { id: service.id } } } })
}
}
async function migrateSecrets(secrets: any[], service: any) {
for (const secret of secrets) {
if (!secret) continue;
let [name, value] = secret.split('@@@')
if (!value || value === 'null') {
continue
}
// console.log('Migrating secret', name, value, 'for service', service.id, ', service name:', service.name)
await prisma.serviceSecret.findFirst({ where: { name, serviceId: service.id } }) || await prisma.serviceSecret.create({ data: { name, value, service: { connect: { id: service.id } } } })
}
}
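Both helpers above consume plain `NAME@@@value` strings; the `@@@` separator keeps name and value in a single array entry, and falsy or literal 'null' entries are silently skipped. A small illustration of the convention (values are made up, not from the diff):

// Sketch: the NAME@@@value convention consumed by migrateSettings()/migrateSecrets().
const entries = [
    `POSTGRES_USER@@@admin`,      // migrated as a setting
    `POSTGRES_PASSWORD@@@s3cr3t`, // migrated as a secret (stored encrypted)
    null,                         // skipped: falsy entry, e.g. an optional field that was absent
    `EMAIL_SMTP_HOST@@@null`      // skipped: literal 'null' value
];
for (const entry of entries) {
    if (!entry) continue;
    const [name, value] = entry.split('@@@');
    if (!value || value === 'null') continue;
    console.log(name, '->', value); // only POSTGRES_USER and POSTGRES_PASSWORD print
}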
async function createVolumes(service: any, template: any) {
const volumes = [];
for (const s of Object.keys(template.services)) {
if (template.services[s].volumes && template.services[s].volumes.length > 0) {
for (const volume of template.services[s].volumes) {
let volumeName = volume.split(':')[0]
const volumePath = volume.split(':')[1]
let volumeService = s
if (service.type === 'plausibleanalytics' && service.plausibleAnalytics?.id) {
let volumeId = volumeName.split('-')[0]
volumeName = volumeName.replace(volumeId, service.plausibleAnalytics.id)
}
volumes.push(`${volumeName}@@@${volumePath}@@@${volumeService}`)
}
}
}
for (const volume of volumes) {
const [volumeName, path, containerId] = volume.split('@@@')
// console.log('Creating volume', volumeName, path, containerId, 'for service', service.id, ', service name:', service.name)
await prisma.servicePersistentStorage.findFirst({ where: { volumeName, serviceId: service.id } }) || await prisma.servicePersistentStorage.create({ data: { volumeName, path, containerId, predefined: true, service: { connect: { id: service.id } } } })
}
}
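One subtlety in createVolumes above: for Plausible Analytics the template volume names are keyed by the service id, but pre-migration volumes were created under the plausibleAnalytics record's own id, so the prefix is swapped, presumably to keep pointing at the existing data. Roughly (the ids below are invented):

// Sketch: the volume-name rewrite applied for migrated Plausible Analytics services.
const serviceId = 'clservicexxxx';
const plausibleAnalyticsId = 'clplausibleyyyy';
let volumeName = `${serviceId}-postgresql-data`;    // after the $$id substitution
const volumeId = volumeName.split('-')[0];          // 'clservicexxxx'
volumeName = volumeName.replace(volumeId, plausibleAnalyticsId);
console.log(volumeName); // 'clplausibleyyyy-postgresql-data'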

View File

@@ -1,4 +1,4 @@
import { base64Encode, executeDockerCmd, generateTimestamp, getDomain, isDev, prisma, version } from "../common";
import { base64Encode, encrypt, executeDockerCmd, generateTimestamp, getDomain, isDev, prisma, version } from "../common";
import { promises as fs } from 'fs';
import { day } from "../dayjs";
@@ -342,13 +342,13 @@ export function setDefaultBaseImage(buildPack: string | null, deploymentType: st
}
if (buildPack === 'laravel') {
payload.baseImage = 'webdevops/php-apache:8.2-alpine';
payload.baseImages = phpVersions;
payload.baseBuildImage = 'node:18';
payload.baseBuildImages = nodeVersions;
}
if (buildPack === 'heroku') {
payload.baseImage = 'heroku/buildpacks:20';
payload.baseImages = herokuVersions;
}
return payload;
}
@@ -384,7 +384,7 @@ export const setDefaultConfiguration = async (data: any) => {
if (!publishDirectory) publishDirectory = template?.publishDirectory || null;
if (baseDirectory) {
if (!baseDirectory.startsWith('/')) baseDirectory = `/${baseDirectory}`;
if (!baseDirectory.endsWith('/')) baseDirectory = `${baseDirectory}/`;
if (baseDirectory.endsWith('/') && baseDirectory !== '/') baseDirectory = baseDirectory.slice(0, -1);
}
if (dockerFileLocation) {
if (!dockerFileLocation.startsWith('/')) dockerFileLocation = `/${dockerFileLocation}`;
@@ -461,17 +461,32 @@ export const saveBuildLog = async ({
buildId: string;
applicationId: string;
}): Promise<any> => {
const { default: got } = await import('got')
if (line && typeof line === 'string' && line.includes('ghs_')) {
const regex = /ghs_.*@/g;
line = line.replace(regex, '<SENSITIVE_DATA_DELETED>@');
}
const addTimestamp = `[${generateTimestamp()}] ${line}`;
if (isDev) console.debug(`[${applicationId}] ${addTimestamp}`);
return await prisma.buildLog.create({
data: {
line: addTimestamp, buildId, time: Number(day().valueOf()), applicationId
}
});
const fluentBitUrl = isDev ? process.env.COOLIFY_CONTAINER_DEV === 'true' ? 'http://coolify-fluentbit:24224' : 'http://localhost:24224' : 'http://coolify-fluentbit:24224';
if (isDev && !process.env.COOLIFY_CONTAINER_DEV) {
console.debug(`[${applicationId}] ${addTimestamp}`);
}
try {
return await got.post(`${fluentBitUrl}/${applicationId}_buildlog_${buildId}.csv`, {
json: {
line: encrypt(line)
}
})
} catch (error) {
return await prisma.buildLog.create({
data: {
line: addTimestamp, buildId, time: Number(day().valueOf()), applicationId
}
});
}
};
export async function copyBaseConfigurationFiles(
@@ -556,7 +571,6 @@ export function checkPnpm(installCommand = null, buildCommand = null, startComma
);
}
export async function buildImage({
applicationId,
tag,
@@ -565,23 +579,26 @@ export async function buildImage({
dockerId,
isCache = false,
debug = false,
dockerFileLocation = '/Dockerfile'
dockerFileLocation = '/Dockerfile',
commit
}) {
if (isCache) {
await saveBuildLog({ line: `Building cache image started.`, buildId, applicationId });
} else {
await saveBuildLog({ line: `Building image started.`, buildId, applicationId });
}
if (!debug && isCache) {
if (!debug) {
await saveBuildLog({
line: `Debug turned off. To see more details, allow it in the configuration.`,
line: `Debug turned off. To see more details, allow it in the features tab.`,
buildId,
applicationId
});
}
const dockerFile = isCache ? `${dockerFileLocation}-cache` : `${dockerFileLocation}`
const cache = `${applicationId}:${tag}${isCache ? '-cache' : ''}`
await executeDockerCmd({ debug, buildId, applicationId, dockerId, command: `docker build --progress plain -f ${workdir}/${dockerFile} -t ${cache} ${workdir}` })
await executeDockerCmd({ debug, buildId, applicationId, dockerId, command: `docker build --progress plain -f ${workdir}/${dockerFile} -t ${cache} --build-arg SOURCE_COMMIT=${commit} ${workdir}` })
const { status } = await prisma.build.findUnique({ where: { id: buildId } })
if (status === 'canceled') {
throw new Error('Deployment canceled.')
@@ -593,30 +610,6 @@ export async function buildImage({
}
}
export async function streamEvents({ stream, docker, buildId, applicationId, debug }) {
await new Promise((resolve, reject) => {
docker.engine.modem.followProgress(stream, onFinished, onProgress);
function onFinished(err, res) {
if (err) reject(err);
resolve(res);
}
async function onProgress(event) {
if (event.error) {
reject(event.error);
} else if (event.stream) {
if (event.stream !== '\n') {
if (debug)
await saveBuildLog({
line: `${event.stream.replace('\n', '')}`,
buildId,
applicationId
});
}
}
}
});
}
export function makeLabelForStandaloneApplication({
applicationId,
fqdn,
@@ -643,6 +636,7 @@ export function makeLabelForStandaloneApplication({
return [
'coolify.managed=true',
`coolify.version=${version}`,
`coolify.applicationId=${applicationId}`,
`coolify.type=standalone-application`,
`coolify.configuration=${base64Encode(
JSON.stringify({
@@ -677,8 +671,6 @@ export async function buildCacheImageWithNode(data, imageForBuild) {
secrets,
pullmergeRequestId
} = data;
const isPnpm = checkPnpm(installCommand, buildCommand);
const Dockerfile: Array<string> = [];
Dockerfile.push(`FROM ${imageForBuild}`);
@@ -688,7 +680,10 @@ export async function buildCacheImageWithNode(data, imageForBuild) {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {
@@ -706,6 +701,7 @@ export async function buildCacheImageWithNode(data, imageForBuild) {
if (installCommand) {
Dockerfile.push(`RUN ${installCommand}`);
}
// Dockerfile.push(`ARG CACHEBUST=1`);
Dockerfile.push(`RUN ${buildCommand}`);
await fs.writeFile(`${workdir}/Dockerfile-cache`, Dockerfile.join('\n'));
await buildImage({ ...data, isCache: true });
@@ -722,7 +718,10 @@ export async function buildCacheImageForLaravel(data, imageForBuild) {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {
@@ -762,4 +761,4 @@ export async function buildCacheImageWithCargo(data, imageForBuild) {
Dockerfile.push('RUN cargo chef cook --release --recipe-path recipe.json');
await fs.writeFile(`${workdir}/Dockerfile-cache`, Dockerfile.join('\n'));
await buildImage({ ...data, isCache: true });
}
}

View File

@@ -0,0 +1,100 @@
import { promises as fs } from 'fs';
import { defaultComposeConfiguration, executeDockerCmd } from '../common';
import { buildImage, saveBuildLog } from './common';
import yaml from 'js-yaml';
export default async function (data) {
let {
applicationId,
debug,
buildId,
dockerId,
network,
volumes,
labels,
workdir,
baseDirectory,
secrets,
pullmergeRequestId,
port,
dockerComposeConfiguration
} = data
const fileYml = `${workdir}${baseDirectory}/docker-compose.yml`;
const fileYaml = `${workdir}${baseDirectory}/docker-compose.yaml`;
let dockerComposeRaw = null;
let isYml = false;
try {
dockerComposeRaw = await fs.readFile(`${fileYml}`, 'utf8')
isYml = true
} catch (error) { }
try {
dockerComposeRaw = await fs.readFile(`${fileYaml}`, 'utf8')
} catch (error) { }
if (!dockerComposeRaw) {
throw ('docker-compose.yml or docker-compose.yaml not found!');
}
const dockerComposeYaml = yaml.load(dockerComposeRaw)
if (!dockerComposeYaml.services) {
throw 'No Services found in docker-compose file.'
}
const envs = [
`PORT=${port}`
];
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
envs.push(`${secret.name}=${isSecretFound[0].value}`);
} else {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
if (!secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
}
});
}
await fs.writeFile(`${workdir}/.env`, envs.join('\n'));
let envFound = false;
try {
envFound = !!(await fs.stat(`${workdir}/.env`));
} catch (error) {
//
}
const composeVolumes = volumes.map((volume) => {
return {
[`${volume.split(':')[0]}`]: {
name: volume.split(':')[0]
}
};
});
let networks = {}
for (let [key, value] of Object.entries(dockerComposeYaml.services)) {
value['container_name'] = `${applicationId}-${key}`
value['env_file'] = envFound ? [`${workdir}/.env`] : []
value['labels'] = labels
value['volumes'] = volumes
if (dockerComposeConfiguration[key]?.port) {
value['expose'] = [dockerComposeConfiguration[key].port]
}
if (value['networks']?.length > 0) {
value['networks'].forEach((network) => {
networks[network] = {
name: network
}
})
}
value['networks'] = [...(value['networks'] || []), network]
dockerComposeYaml.services[key] = { ...dockerComposeYaml.services[key], restart: defaultComposeConfiguration(network).restart, deploy: defaultComposeConfiguration(network).deploy }
}
dockerComposeYaml['volumes'] = Object.assign({ ...dockerComposeYaml['volumes'] }, ...composeVolumes)
dockerComposeYaml['networks'] = Object.assign({ ...networks }, { [network]: { external: true } })
await fs.writeFile(`${workdir}/docker-compose.${isYml ? 'yml' : 'yaml'}`, yaml.dump(dockerComposeYaml));
await executeDockerCmd({ debug, buildId, applicationId, dockerId, command: `docker compose --project-directory ${workdir} pull` })
await saveBuildLog({ line: 'Pulling images from Compose file.', buildId, applicationId });
await executeDockerCmd({ debug, buildId, applicationId, dockerId, command: `docker compose --project-directory ${workdir} build --progress plain` })
await saveBuildLog({ line: 'Building images from Compose file.', buildId, applicationId });
}
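The `dockerComposeConfiguration` consulted above maps compose service keys to per-service overrides passed in with the build data; the only field this buildpack reads is `port`, which becomes an `expose` entry. A hypothetical shape (service names and port are invented):

// Sketch: a possible dockerComposeConfiguration for a compose file with `web` and `worker`.
const dockerComposeConfiguration: Record<string, { port?: number }> = {
    web: { port: 3000 }, // becomes expose: [3000] on the generated service
    worker: {}           // no port -> nothing exposed
};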

View File

@@ -27,7 +27,10 @@ const createDockerfile = async (data, image): Promise<void> => {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {
@@ -46,7 +49,7 @@ const createDockerfile = async (data, image): Promise<void> => {
Dockerfile.push(`RUN deno cache ${denoMainFile}`);
Dockerfile.push(`ENV NO_COLOR true`);
Dockerfile.push(`EXPOSE ${port}`);
Dockerfile.push(`CMD deno run ${denoOptions ? denoOptions.split(' ') : ''} ${denoMainFile}`);
Dockerfile.push(`CMD deno run ${denoOptions || ''} ${denoMainFile}`);
await fs.writeFile(`${workdir}/Dockerfile`, Dockerfile.join('\n'));
};

View File

@@ -13,40 +13,33 @@ export default async function (data) {
pullmergeRequestId,
dockerFileLocation
} = data
try {
const file = `${workdir}${dockerFileLocation}`;
let dockerFileOut = `${workdir}`;
if (baseDirectory) {
dockerFileOut = `${workdir}${baseDirectory}`;
workdir = `${workdir}${baseDirectory}`;
}
const Dockerfile: Array<string> = (await fs.readFile(`${file}`, 'utf8'))
.toString()
.trim()
.split('\n');
Dockerfile.push(`LABEL coolify.buildId=${buildId}`);
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (
(pullmergeRequestId && secret.isPRMRSecret) ||
(!pullmergeRequestId && !secret.isPRMRSecret)
) {
Dockerfile.unshift(`ARG ${secret.name}=${secret.value}`);
const file = `${workdir}${baseDirectory}${dockerFileLocation}`;
data.workdir = `${workdir}${baseDirectory}`;
const DockerfileRaw = await fs.readFile(`${file}`, 'utf8')
const Dockerfile: Array<string> = DockerfileRaw
.toString()
.trim()
.split('\n');
Dockerfile.push(`LABEL coolify.buildId=${buildId}`);
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (
(pullmergeRequestId && secret.isPRMRSecret) ||
(!pullmergeRequestId && !secret.isPRMRSecret)
) {
Dockerfile.unshift(`ARG ${secret.name}=${secret.value}`);
Dockerfile.forEach((line, index) => {
if (line.startsWith('FROM')) {
Dockerfile.splice(index + 1, 0, `ARG ${secret.name}`);
}
});
}
Dockerfile.forEach((line, index) => {
if (line.startsWith('FROM')) {
Dockerfile.splice(index + 1, 0, `ARG ${secret.name}`);
}
});
}
});
}
await fs.writeFile(`${dockerFileOut}${dockerFileLocation}`, Dockerfile.join('\n'));
await buildImage(data);
} catch (error) {
throw error;
}
});
}
await fs.writeFile(`${workdir}${dockerFileLocation}`, Dockerfile.join('\n'));
await buildImage(data);
}

View File

@@ -2,38 +2,17 @@ import { executeDockerCmd, prisma } from "../common"
import { saveBuildLog } from "./common";
export default async function (data: any): Promise<void> {
const { buildId, applicationId, tag, dockerId, debug, workdir } = data
const { buildId, applicationId, tag, dockerId, debug, workdir, baseDirectory, baseImage } = data
try {
await saveBuildLog({ line: `Building image started.`, buildId, applicationId });
const { stdout } = await executeDockerCmd({
await executeDockerCmd({
buildId,
debug,
dockerId,
command: `pack build -p ${workdir} ${applicationId}:${tag} --builder heroku/buildpacks:20`
command: `pack build -p ${workdir}${baseDirectory} ${applicationId}:${tag} --builder ${baseImage}`
})
if (debug) {
const array = stdout.split('\n')
for (const line of array) {
if (line !== '\n') {
await saveBuildLog({
line: `${line.replace('\n', '')}`,
buildId,
applicationId
});
}
}
}
await saveBuildLog({ line: `Building image successful.`, buildId, applicationId });
} catch (error) {
const array = error.stdout.split('\n')
for (const line of array) {
if (line !== '\n') {
await saveBuildLog({
line: `${line.replace('\n', '')}`,
buildId,
applicationId
});
}
}
throw error;
}
}

View File

@@ -16,6 +16,7 @@ import python from './python';
import deno from './deno';
import laravel from './laravel';
import heroku from './heroku';
import compose from './compose'
export {
node,
@@ -35,5 +36,6 @@ export {
python,
deno,
laravel,
heroku
heroku,
compose
};

View File

@@ -27,7 +27,10 @@ const createDockerfile = async (data, image): Promise<void> => {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {

View File

@@ -23,7 +23,10 @@ const createDockerfile = async (data, image): Promise<void> => {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {

View File

@@ -27,7 +27,10 @@ const createDockerfile = async (data, image): Promise<void> => {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {

View File

@@ -16,7 +16,10 @@ const createDockerfile = async (data, image, htaccessFound): Promise<void> => {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {

View File

@@ -21,7 +21,10 @@ const createDockerfile = async (data, image): Promise<void> => {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {

View File

@@ -24,7 +24,10 @@ const createDockerfile = async (data, image): Promise<void> => {
secrets.forEach((secret) => {
if (secret.isBuildSecret) {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
Dockerfile.push(`ARG ${secret.name}=${isSecretFound[0].value}`);
} else {
Dockerfile.push(`ARG ${secret.name}=${secret.value}`);
}
} else {

File diff suppressed because it is too large

View File

@@ -13,7 +13,7 @@ export function formatLabelsOnDocker(data) {
return container
})
}
export async function checkContainer({ dockerId, container, remove = false }: { dockerId: string, container: string, remove?: boolean }): Promise<boolean> {
export async function checkContainer({ dockerId, container, remove = false }: { dockerId: string, container: string, remove?: boolean }): Promise<{ found: boolean, status?: { isExited: boolean, isRunning: boolean, isRestarting: boolean } }> {
let containerFound = false;
try {
const { stdout } = await executeDockerCmd({
@@ -21,10 +21,12 @@ export async function checkContainer({ dockerId, container, remove = false }: {
command:
`docker inspect --format '{{json .State}}' ${container}`
});
containerFound = true
const parsedStdout = JSON.parse(stdout);
const status = parsedStdout.Status;
const isRunning = status === 'running';
const isRestarting = status === 'restarting'
const isExited = status === 'exited'
if (status === 'created') {
await executeDockerCmd({
dockerId,
@@ -39,13 +41,23 @@ export async function checkContainer({ dockerId, container, remove = false }: {
`docker rm ${container}`
});
}
if (isRunning) {
containerFound = true;
}
return {
found: containerFound,
status: {
isRunning,
isRestarting,
isExited
}
};
} catch (err) {
// Container not found
}
return containerFound;
return {
found: false
};
}
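Since checkContainer now returns `{ found, status }` instead of a bare boolean, callers have to destructure the result. A minimal sketch of the new call shape (the wrapper name and parameters are placeholders, not from the diff):

// Sketch: consuming the new checkContainer() return value.
async function isRunning(dockerId: string, container: string): Promise<boolean> {
    const { found, status } = await checkContainer({ dockerId, container });
    return found && !!status?.isRunning;
}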
export async function isContainerExited(dockerId: string, containerName: string): Promise<boolean> {
@@ -75,6 +87,9 @@ export async function removeContainer({
await executeDockerCmd({ dockerId, command: `docker stop -t 0 ${id}` })
await executeDockerCmd({ dockerId, command: `docker rm ${id}` })
}
if (JSON.parse(stdout).Status === 'exited') {
await executeDockerCmd({ dockerId, command: `docker rm ${id}` })
}
} catch (error) {
throw error;
}

View File

@@ -73,6 +73,4 @@ export default async function ({
const { stdout: commit } = await asyncExecShell(`cd ${workdir}/ && git rev-parse HEAD`);
return commit.replace('\n', '');
}

View File

@@ -10,7 +10,8 @@ export default async function ({
branch,
buildId,
privateSshKey,
customPort
customPort,
forPublic
}: {
applicationId: string;
workdir: string;
@@ -21,11 +22,15 @@ export default async function ({
repodir: string;
privateSshKey: string;
customPort: number;
forPublic: boolean;
}): Promise<string> {
const url = htmlUrl.replace('https://', '').replace('http://', '').replace(/\/$/, '');
await saveBuildLog({ line: 'GitLab importer started.', buildId, applicationId });
await asyncExecShell(`echo '${privateSshKey}' > ${repodir}/id.rsa`);
await asyncExecShell(`chmod 600 ${repodir}/id.rsa`);
if (!forPublic) {
await asyncExecShell(`echo '${privateSshKey}' > ${repodir}/id.rsa`);
await asyncExecShell(`chmod 600 ${repodir}/id.rsa`);
}
await saveBuildLog({
line: `Cloning ${repository}:${branch} branch.`,
@@ -33,9 +38,16 @@ export default async function ({
applicationId
});
await asyncExecShell(
`git clone -q -b ${branch} git@${url}:${repository}.git --config core.sshCommand="ssh -p ${customPort} -q -i ${repodir}id.rsa -o StrictHostKeyChecking=no" ${workdir}/ && cd ${workdir}/ && git submodule update --init --recursive && git lfs pull && cd .. `
);
if (forPublic) {
await asyncExecShell(
`git clone -q -b ${branch} https://${url}/${repository}.git ${workdir}/ && cd ${workdir}/ && git submodule update --init --recursive && git lfs pull && cd .. `
);
} else {
await asyncExecShell(
`git clone -q -b ${branch} git@${url}:${repository}.git --config core.sshCommand="ssh -p ${customPort} -q -i ${repodir}id.rsa -o StrictHostKeyChecking=no" ${workdir}/ && cd ${workdir}/ && git submodule update --init --recursive && git lfs pull && cd .. `
);
}
const { stdout: commit } = await asyncExecShell(`cd ${workdir}/ && git rev-parse HEAD`);
return commit.replace('\n', '');
}

View File

@@ -9,17 +9,16 @@ Bree.extend(TSBree);
const options: any = {
defaultExtension: 'js',
// logger: new Cabin(),
logger: false,
workerMessageHandler: async ({ name, message }) => {
if (name === 'deployApplication' && message?.deploying) {
if (scheduler.workers.has('autoUpdater') || scheduler.workers.has('cleanupStorage')) {
scheduler.workers.get('deployApplication').postMessage('cancel')
}
}
},
logger: new Cabin(),
// logger: false,
// workerMessageHandler: async ({ name, message }) => {
// if (name === 'deployApplication' && message?.deploying) {
// if (scheduler.workers.has('autoUpdater') || scheduler.workers.has('cleanupStorage')) {
// scheduler.workers.get('deployApplication').postMessage('cancel')
// }
// }
// },
jobs: [
{ name: 'infrastructure' },
{ name: 'deployApplication' },
],
};

View File

@@ -1,20 +1,170 @@
import { createDirectories, getServiceFromDB, getServiceImage, getServiceMainPort, makeLabelForServices } from "./common";
export async function defaultServiceConfigurations({ id, teamId }) {
const service = await getServiceFromDB({ id, teamId });
const { destinationDockerId, destinationDocker, type, serviceSecret } = service;
const network = destinationDockerId && destinationDocker.network;
const port = getServiceMainPort(type);
const { workdir } = await createDirectories({ repository: type, buildId: id });
const image = getServiceImage(type);
let secrets = [];
if (serviceSecret.length > 0) {
serviceSecret.forEach((secret) => {
secrets.push(`${secret.name}=${secret.value}`);
});
import { isDev } from "./common";
import fs from 'fs/promises';
export async function getTemplates() {
let templates: any = [];
if (isDev) {
templates = JSON.parse(await (await fs.readFile('./templates.json')).toString())
} else {
templates = JSON.parse(await (await fs.readFile('/app/templates.json')).toString())
}
return { ...service, network, port, workdir, image, secrets }
}
// if (!isDev) {
// templates.push({
// "templateVersion": "1.0.0",
// "defaultVersion": "latest",
// "name": "Test-Fake-Service",
// "description": "",
// "services": {
// "$$id": {
// "name": "Test-Fake-Service",
// "depends_on": [
// "$$id-postgresql",
// "$$id-redis"
// ],
// "image": "weblate/weblate:$$core_version",
// "volumes": [
// "$$id-data:/app/data",
// ],
// "environment": [
// `POSTGRES_SECRET=$$secret_postgres_secret`,
// `WEBLATE_SITE_DOMAIN=$$config_weblate_site_domain`,
// `WEBLATE_ADMIN_PASSWORD=$$secret_weblate_admin_password`,
// `POSTGRES_PASSWORD=$$secret_postgres_password`,
// `POSTGRES_USER=$$config_postgres_user`,
// `POSTGRES_DATABASE=$$config_postgres_db`,
// `POSTGRES_HOST=$$id-postgresql`,
// `POSTGRES_PORT=5432`,
// `REDIS_HOST=$$id-redis`,
// ],
// "ports": [
// "8080"
// ]
// },
// "$$id-postgresql": {
// "name": "PostgreSQL",
// "depends_on": [],
// "image": "postgres:14-alpine",
// "volumes": [
// "$$id-postgresql-data:/var/lib/postgresql/data",
// ],
// "environment": [
// "POSTGRES_USER=$$config_postgres_user",
// "POSTGRES_PASSWORD=$$secret_postgres_password",
// "POSTGRES_DB=$$config_postgres_db",
// ],
// "ports": []
// },
// "$$id-redis": {
// "name": "Redis",
// "depends_on": [],
// "image": "redis:7-alpine",
// "volumes": [
// "$$id-redis-data:/data",
// ],
// "environment": [],
// "ports": [],
// }
// },
// "variables": [
// {
// "id": "$$config_weblate_site_domain",
// "main": "$$id",
// "name": "WEBLATE_SITE_DOMAIN",
// "label": "Weblate Domain",
// "defaultValue": "$$generate_domain",
// "description": "",
// },
// {
// "id": "$$secret_weblate_admin_password",
// "main": "$$id",
// "name": "WEBLATE_ADMIN_PASSWORD",
// "label": "Weblate Admin Password",
// "defaultValue": "$$generate_password",
// "description": "",
// "extras": {
// "isVisibleOnUI": true,
// }
// },
// {
// "id": "$$secret_weblate_admin_password2",
// "name": "WEBLATE_ADMIN_PASSWORD2",
// "label": "Weblate Admin Password2",
// "defaultValue": "$$generate_password",
// "description": "",
// },
// {
// "id": "$$config_postgres_user",
// "main": "$$id-postgresql",
// "name": "POSTGRES_USER",
// "label": "PostgreSQL User",
// "defaultValue": "$$generate_username",
// "description": "",
// },
// {
// "id": "$$secret_postgres_password",
// "main": "$$id-postgresql",
// "name": "POSTGRES_PASSWORD",
// "label": "PostgreSQL Password",
// "defaultValue": "$$generate_password(32)",
// "description": "",
// },
// {
// "id": "$$secret_postgres_password_hex32",
// "name": "POSTGRES_PASSWORD_hex32",
// "label": "PostgreSQL Password hex32",
// "defaultValue": "$$generate_hex(32)",
// "description": "",
// },
// {
// "id": "$$config_postgres_something_hex32",
// "name": "POSTGRES_SOMETHING_HEX32",
// "label": "PostgreSQL Something hex32",
// "defaultValue": "$$generate_hex(32)",
// "description": "",
// },
// {
// "id": "$$config_postgres_db",
// "main": "$$id-postgresql",
// "name": "POSTGRES_DB",
// "label": "PostgreSQL Database",
// "defaultValue": "weblate",
// "description": "",
// },
// {
// "id": "$$secret_postgres_secret",
// "name": "POSTGRES_SECRET",
// "label": "PostgreSQL Secret",
// "defaultValue": "",
// "description": "",
// },
// ]
// })
// }
return templates
}
const compareSemanticVersions = (a: string, b: string) => {
const a1 = a.split('.');
const b1 = b.split('.');
const len = Math.min(a1.length, b1.length);
for (let i = 0; i < len; i++) {
const a2 = +a1[i] || 0;
const b2 = +b1[i] || 0;
if (a2 !== b2) {
return a2 > b2 ? 1 : -1;
}
}
return b1.length - a1.length;
};
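getTags below relies on this comparator to order image tags; for example (the tags are hypothetical):

// Sketch: sorting made-up tags ascending, then reversing so the newest comes first,
// exactly as getTags() does below.
const sorted = ['2.10.0', '2.5.1', '2.9.9'].sort(compareSemanticVersions).reverse();
console.log(sorted); // ['2.10.0', '2.9.9', '2.5.1']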
export async function getTags(type: string) {
if (type) {
let tags: any = [];
if (isDev) {
tags = JSON.parse(await (await fs.readFile('./tags.json')).toString())
} else {
tags = JSON.parse(await (await fs.readFile('/app/tags.json')).toString())
}
tags = tags.find((tag: any) => tag.name.includes(type))
if (!tags) return []
tags.tags = tags.tags.sort(compareSemanticVersions).reverse();
return tags
}
return []
}

View File

@@ -1,367 +1,9 @@
import cuid from 'cuid';
import { encrypt, generatePassword, prisma } from '../common';
export const includeServices: any = {
destinationDocker: true,
persistentStorage: true,
serviceSecret: true,
minio: true,
plausibleAnalytics: true,
vscodeserver: true,
wordpress: true,
ghost: true,
meiliSearch: true,
umami: true,
hasura: true,
fider: true,
moodle: true,
appwrite: true,
glitchTip: true,
searxng: true,
weblate: true,
taiga: true
};
export async function configureServiceType({
id,
type
}: {
id: string;
type: string;
}): Promise<void> {
if (type === 'plausibleanalytics') {
const password = encrypt(generatePassword({}));
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'plausibleanalytics';
const secretKeyBase = encrypt(generatePassword({ length: 64 }));
await prisma.service.update({
where: { id },
data: {
type,
plausibleAnalytics: {
create: {
postgresqlDatabase,
postgresqlUser,
postgresqlPassword,
password,
secretKeyBase
}
}
}
});
} else if (type === 'nocodb') {
await prisma.service.update({
where: { id },
data: { type }
});
} else if (type === 'minio') {
const rootUser = cuid();
const rootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: { type, minio: { create: { rootUser, rootUserPassword } } }
});
} else if (type === 'vscodeserver') {
const password = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: { type, vscodeserver: { create: { password } } }
});
} else if (type === 'wordpress') {
const mysqlUser = cuid();
const mysqlPassword = encrypt(generatePassword({}));
const mysqlRootUser = cuid();
const mysqlRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
wordpress: { create: { mysqlPassword, mysqlRootUserPassword, mysqlRootUser, mysqlUser } }
}
});
} else if (type === 'vaultwarden') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'languagetool') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'n8n') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'uptimekuma') {
await prisma.service.update({
where: { id },
data: {
type
}
});
} else if (type === 'ghost') {
const defaultEmail = `${cuid()}@example.com`;
const defaultPassword = encrypt(generatePassword({}));
const mariadbUser = cuid();
const mariadbPassword = encrypt(generatePassword({}));
const mariadbRootUser = cuid();
const mariadbRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
ghost: {
create: {
defaultEmail,
defaultPassword,
mariadbUser,
mariadbPassword,
mariadbRootUser,
mariadbRootUserPassword
}
}
}
});
} else if (type === 'meilisearch') {
const masterKey = encrypt(generatePassword({ length: 32 }));
await prisma.service.update({
where: { id },
data: {
type,
meiliSearch: { create: { masterKey } }
}
});
} else if (type === 'umami') {
const umamiAdminPassword = encrypt(generatePassword({}));
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'umami';
const hashSalt = encrypt(generatePassword({ length: 64 }));
await prisma.service.update({
where: { id },
data: {
type,
umami: {
create: {
umamiAdminPassword,
postgresqlDatabase,
postgresqlPassword,
postgresqlUser,
hashSalt
}
}
}
});
} else if (type === 'hasura') {
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'hasura';
const graphQLAdminPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
hasura: {
create: {
postgresqlDatabase,
postgresqlPassword,
postgresqlUser,
graphQLAdminPassword
}
}
}
});
} else if (type === 'fider') {
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'fider';
const jwtSecret = encrypt(generatePassword({ length: 64, symbols: true }));
await prisma.service.update({
where: { id },
data: {
type,
fider: {
create: {
postgresqlDatabase,
postgresqlPassword,
postgresqlUser,
jwtSecret
}
}
}
});
} else if (type === 'moodle') {
const defaultUsername = cuid();
const defaultPassword = encrypt(generatePassword({}));
const defaultEmail = `${cuid()} @example.com`;
const mariadbUser = cuid();
const mariadbPassword = encrypt(generatePassword({}));
const mariadbDatabase = 'moodle_db';
const mariadbRootUser = cuid();
const mariadbRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
moodle: {
create: {
defaultUsername,
defaultPassword,
defaultEmail,
mariadbUser,
mariadbPassword,
mariadbDatabase,
mariadbRootUser,
mariadbRootUserPassword
}
}
}
});
} else if (type === 'appwrite') {
const opensslKeyV1 = encrypt(generatePassword({}));
const executorSecret = encrypt(generatePassword({}));
const redisPassword = encrypt(generatePassword({}));
const mariadbHost = `${id}-mariadb`
const mariadbUser = cuid();
const mariadbPassword = encrypt(generatePassword({}));
const mariadbDatabase = 'appwrite';
const mariadbRootUser = cuid();
const mariadbRootUserPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
appwrite: {
create: {
opensslKeyV1,
executorSecret,
redisPassword,
mariadbHost,
mariadbUser,
mariadbPassword,
mariadbDatabase,
mariadbRootUser,
mariadbRootUserPassword
}
}
}
});
} else if (type === 'glitchTip') {
const defaultUsername = cuid();
const defaultEmail = `${defaultUsername}@example.com`;
const defaultPassword = encrypt(generatePassword({}));
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'glitchTip';
const secretKeyBase = encrypt(generatePassword({ length: 64 }));
await prisma.service.update({
where: { id },
data: {
type,
glitchTip: {
create: {
postgresqlDatabase,
postgresqlUser,
postgresqlPassword,
secretKeyBase,
defaultEmail,
defaultUsername,
defaultPassword,
}
}
}
});
} else if (type === 'searxng') {
const secretKey = encrypt(generatePassword({ length: 32, isHex: true }))
const redisPassword = encrypt(generatePassword({}));
await prisma.service.update({
where: { id },
data: {
type,
searxng: {
create: {
secretKey,
redisPassword,
}
}
}
});
} else if (type === 'weblate') {
const adminPassword = encrypt(generatePassword({}))
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'weblate';
await prisma.service.update({
where: { id },
data: {
type,
weblate: {
create: {
adminPassword,
postgresqlHost: `${id}-postgresql`,
postgresqlPort: 5432,
postgresqlUser,
postgresqlPassword,
postgresqlDatabase,
}
}
}
});
} else if (type === 'taiga') {
const secretKey = encrypt(generatePassword({}))
const erlangSecret = encrypt(generatePassword({}))
const rabbitMQUser = cuid();
const djangoAdminUser = cuid();
const djangoAdminPassword = encrypt(generatePassword({}))
const rabbitMQPassword = encrypt(generatePassword({}))
const postgresqlUser = cuid();
const postgresqlPassword = encrypt(generatePassword({}));
const postgresqlDatabase = 'taiga';
await prisma.service.update({
where: { id },
data: {
type,
taiga: {
create: {
secretKey,
erlangSecret,
djangoAdminUser,
djangoAdminPassword,
rabbitMQUser,
rabbitMQPassword,
postgresqlHost: `${id}-postgresql`,
postgresqlPort: 5432,
postgresqlUser,
postgresqlPassword,
postgresqlDatabase,
}
}
}
});
} else {
await prisma.service.update({
where: { id },
data: {
type
}
});
}
}
import { prisma } from '../common';
export async function removeService({ id }: { id: string }): Promise<void> {
await prisma.serviceSecret.deleteMany({ where: { serviceId: id } });
await prisma.serviceSetting.deleteMany({ where: { serviceId: id } });
await prisma.servicePersistentStorage.deleteMany({ where: { serviceId: id } });
await prisma.meiliSearch.deleteMany({ where: { serviceId: id } });
await prisma.fider.deleteMany({ where: { serviceId: id } });
@@ -378,6 +20,6 @@ export async function removeService({ id }: { id: string }): Promise<void> {
await prisma.searxng.deleteMany({ where: { serviceId: id } });
await prisma.weblate.deleteMany({ where: { serviceId: id } });
await prisma.taiga.deleteMany({ where: { serviceId: id } });
await prisma.service.delete({ where: { id } });
}

File diff suppressed because it is too large

View File

@@ -1,215 +0,0 @@
export const supportedServiceTypesAndVersions = [
{
name: 'plausibleanalytics',
fancyName: 'Plausible Analytics',
baseImage: 'plausible/analytics',
images: ['bitnami/postgresql:13.2.0', 'yandex/clickhouse-server:21.3.2.5'],
versions: ['latest', 'stable'],
recommendedVersion: 'stable',
ports: {
main: 8000
}
},
{
name: 'nocodb',
fancyName: 'NocoDB',
baseImage: 'nocodb/nocodb',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'minio',
fancyName: 'MinIO',
baseImage: 'minio/minio',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 9001
}
},
{
name: 'vscodeserver',
fancyName: 'VSCode Server',
baseImage: 'codercom/code-server',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'wordpress',
fancyName: 'Wordpress',
baseImage: 'wordpress',
images: ['bitnami/mysql:5.7'],
versions: ['latest', 'php8.1', 'php8.0', 'php7.4', 'php7.3'],
recommendedVersion: 'latest',
ports: {
main: 80
}
},
{
name: 'vaultwarden',
fancyName: 'Vaultwarden',
baseImage: 'vaultwarden/server',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 80
}
},
{
name: 'languagetool',
fancyName: 'LanguageTool',
baseImage: 'silviof/docker-languagetool',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8010
}
},
{
name: 'n8n',
fancyName: 'n8n',
baseImage: 'n8nio/n8n',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 5678
}
},
{
name: 'uptimekuma',
fancyName: 'Uptime Kuma',
baseImage: 'louislam/uptime-kuma',
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 3001
}
},
{
name: 'ghost',
fancyName: 'Ghost',
baseImage: 'bitnami/ghost',
images: ['bitnami/mariadb'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 2368
}
},
{
name: 'meilisearch',
fancyName: 'Meilisearch',
baseImage: 'getmeili/meilisearch',
images: [],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 7700
}
},
{
name: 'umami',
fancyName: 'Umami',
baseImage: 'ghcr.io/mikecao/umami',
images: ['postgres:12-alpine'],
versions: ['postgresql-latest'],
recommendedVersion: 'postgresql-latest',
ports: {
main: 3000
}
},
{
name: 'hasura',
fancyName: 'Hasura',
baseImage: 'hasura/graphql-engine',
images: ['postgres:12-alpine'],
versions: ['latest', 'v2.10.0', 'v2.5.1'],
recommendedVersion: 'v2.10.0',
ports: {
main: 8080
}
},
{
name: 'fider',
fancyName: 'Fider',
baseImage: 'getfider/fider',
images: ['postgres:12-alpine'],
versions: ['stable'],
recommendedVersion: 'stable',
ports: {
main: 3000
}
},
{
name: 'appwrite',
fancyName: 'Appwrite',
baseImage: 'appwrite/appwrite',
images: ['mariadb:10.7', 'redis:6.2-alpine', 'appwrite/telegraf:1.4.0'],
versions: ['latest', '0.15.3'],
recommendedVersion: '0.15.3',
ports: {
main: 80
}
},
// {
// name: 'moodle',
// fancyName: 'Moodle',
// baseImage: 'bitnami/moodle',
// images: [],
// versions: ['latest', 'v4.0.2'],
// recommendedVersion: 'latest',
// ports: {
// main: 8080
// }
// }
{
name: 'glitchTip',
fancyName: 'GlitchTip',
baseImage: 'glitchtip/glitchtip',
images: ['postgres:14-alpine', 'redis:7-alpine'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8000
}
},
{
name: 'searxng',
fancyName: 'SearXNG',
baseImage: 'searxng/searxng',
images: [],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
{
name: 'weblate',
fancyName: 'Weblate',
baseImage: 'weblate/weblate',
images: ['postgres:14-alpine', 'redis:6-alpine'],
versions: ['latest'],
recommendedVersion: 'latest',
ports: {
main: 8080
}
},
// {
// name: 'taiga',
// fancyName: 'Taiga',
// baseImage: 'taigaio/taiga-front',
// images: ['postgres:12.3', 'rabbitmq:3.8-management-alpine', 'taigaio/taiga-back', 'taigaio/taiga-events', 'taigaio/taiga-protected'],
// versions: ['latest'],
// recommendedVersion: 'latest',
// ports: {
// main: 80
// }
// },
];
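This list was a static catalog of supported one-click services. For illustration, a hypothetical lookup (the helper name and import path are ours, not the codebase's) showing how an entry's recommended version and main port would typically be read:

```ts
import { supportedServiceTypesAndVersions } from './supportedVersions';

// Hypothetical helper: reads an entry's defaults from the catalog above.
export function getServiceDefaults(name: string) {
  const service = supportedServiceTypesAndVersions.find((s) => s.name === name);
  if (!service) throw new Error(`Unknown service type: ${name}`);
  return { version: service.recommendedVersion, port: service.ports.main };
}

// getServiceDefaults('minio') -> { version: 'latest', port: 9001 }
```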


@@ -0,0 +1,29 @@
export default async (fastify) => {
fastify.io.use((socket, next) => {
const { token } = socket.handshake.auth;
if (token && fastify.jwt.verify(token)) {
next();
} else {
return next(new Error("unauthorized event"));
}
});
fastify.io.on('connection', (socket: any) => {
const { token } = socket.handshake.auth;
const { teamId } = fastify.jwt.decode(token);
socket.join(teamId);
// console.info('Socket connected!', socket.id)
// console.info('Socket joined team!', teamId)
// socket.on('message', (message) => {
// console.log(message)
// })
// socket.on('error', (err) => {
// console.log(err)
// })
})
// fastify.io.on("error", (err) => {
// if (err && err.message === "unauthorized event") {
// fastify.io.disconnect();
// }
// });
}
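This middleware expects the JWT in the Socket.IO handshake's auth field. A minimal client-side sketch, assuming socket.io-client and a placeholder URL and token source:

```ts
import { io } from 'socket.io-client';

// Placeholder URL and token storage; the server verifies the JWT and joins the socket to its team room.
const token = localStorage.getItem('token') ?? '';
const socket = io('http://localhost:3000', { auth: { token } });

socket.on('connect', () => console.log('connected as', socket.id));
socket.on('connect_error', (err) => console.error(err.message)); // e.g. "unauthorized event"
```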


@@ -1,18 +1,18 @@
import cuid from 'cuid';
import crypto from 'node:crypto'
import jsonwebtoken from 'jsonwebtoken';
import axios from 'axios';
import { FastifyReply } from 'fastify';
import fs from 'fs/promises';
import yaml from 'js-yaml';
import csv from 'csvtojson';
import { day } from '../../../../lib/dayjs';
import { makeLabelForStandaloneApplication, setDefaultBaseImage, setDefaultConfiguration } from '../../../../lib/buildPacks/common';
import { checkDomainsIsValidInDNS, checkDoubleBranch, checkExposedPort, createDirectories, decrypt, defaultComposeConfiguration, encrypt, errorHandler, executeDockerCmd, generateSshKeyPair, getContainerUsage, getDomain, isDev, isDomainConfigured, listSettings, prisma, stopBuild, uniqueName } from '../../../../lib/common';
import { checkContainer, formatLabelsOnDocker, isContainerExited, removeContainer } from '../../../../lib/docker';
import { setDefaultBaseImage, setDefaultConfiguration } from '../../../../lib/buildPacks/common';
import { checkDomainsIsValidInDNS, checkExposedPort, createDirectories, decrypt, defaultComposeConfiguration, encrypt, errorHandler, executeDockerCmd, generateSshKeyPair, getContainerUsage, getDomain, isDev, isDomainConfigured, listSettings, prisma, stopBuild, uniqueName } from '../../../../lib/common';
import { checkContainer, formatLabelsOnDocker, removeContainer } from '../../../../lib/docker';
import type { FastifyRequest } from 'fastify';
import type { GetImages, CancelDeployment, CheckDNS, CheckRepository, DeleteApplication, DeleteSecret, DeleteStorage, GetApplicationLogs, GetBuildIdLogs, GetBuildLogs, SaveApplication, SaveApplicationSettings, SaveApplicationSource, SaveDeployKey, SaveDestination, SaveSecret, SaveStorage, DeployApplication, CheckDomain, StopPreviewApplication } from './types';
import type { GetImages, CancelDeployment, CheckDNS, CheckRepository, DeleteApplication, DeleteSecret, DeleteStorage, GetApplicationLogs, GetBuildIdLogs, SaveApplication, SaveApplicationSettings, SaveApplicationSource, SaveDeployKey, SaveDestination, SaveSecret, SaveStorage, DeployApplication, CheckDomain, StopPreviewApplication, RestartPreviewApplication, GetBuilds } from './types';
import { OnlyId } from '../../../../types';
function filterObject(obj, callback) {
@@ -68,22 +68,105 @@ export async function getImages(request: FastifyRequest<GetImages>) {
return errorHandler({ status, message })
}
}
export async function cleanupUnconfiguredApplications(request: FastifyRequest<any>) {
try {
const teamId = request.user.teamId
let applications = await prisma.application.findMany({
where: { teams: { some: { id: teamId === "0" ? undefined : teamId } } },
include: { settings: true, destinationDocker: true, teams: true },
});
for (const application of applications) {
if (!application.buildPack || !application.destinationDockerId || !application.branch || (!application.settings?.isBot && !application?.fqdn)) {
if (application?.destinationDockerId && application.destinationDocker?.network) {
const { stdout: containers } = await executeDockerCmd({
dockerId: application.destinationDocker.id,
command: `docker ps -a --filter network=${application.destinationDocker.network} --filter name=${application.id} --format '{{json .}}'`
})
if (containers) {
const containersArray = containers.trim().split('\n');
for (const container of containersArray) {
const containerObj = JSON.parse(container);
const id = containerObj.ID;
await removeContainer({ id, dockerId: application.destinationDocker.id });
}
}
}
await prisma.applicationSettings.deleteMany({ where: { applicationId: application.id } });
await prisma.buildLog.deleteMany({ where: { applicationId: application.id } });
await prisma.build.deleteMany({ where: { applicationId: application.id } });
await prisma.secret.deleteMany({ where: { applicationId: application.id } });
await prisma.applicationPersistentStorage.deleteMany({ where: { applicationId: application.id } });
await prisma.applicationConnectedDatabase.deleteMany({ where: { applicationId: application.id } });
await prisma.application.deleteMany({ where: { id: application.id } });
}
}
return {}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
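Both this cleanup handler and getApplicationStatus below parse `docker ps ... --format '{{json .}}'`, which emits one JSON object per line. A small sketch of the fields they consume; the sample values are invented:

```ts
// One line of `docker ps -a --format '{{json .}}'` output; values here are made up.
const line = '{"ID":"f2a1b3c4d5e6","Names":"clxyz123","State":"running","Image":"clxyz123:latest"}';

const { ID, Names, State } = JSON.parse(line);
// ID    -> passed to removeContainer() during cleanup
// Names -> reported in the compose status payload
// State -> 'running' | 'exited' | 'restarting' | ...
console.log(ID, Names, State);
```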
export async function getApplicationStatus(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
const { teamId } = request.user
let isRunning = false;
let isExited = false;
let payload = []
const application: any = await getApplicationFromDB(id, teamId);
if (application?.destinationDockerId) {
isRunning = await checkContainer({ dockerId: application.destinationDocker.id, container: id });
isExited = await isContainerExited(application.destinationDocker.id, id);
if (application.buildPack === 'compose') {
const { stdout: containers } = await executeDockerCmd({
dockerId: application.destinationDocker.id,
command:
`docker ps -a --filter "label=coolify.applicationId=${id}" --format '{{json .}}'`
});
const containersArray = containers.trim().split('\n');
if (containersArray.length > 0 && containersArray[0] !== '') {
for (const container of containersArray) {
let isRunning = false;
let isExited = false;
let isRestarting = false;
const containerObj = JSON.parse(container);
const status = containerObj.State
if (status === 'running') {
isRunning = true;
}
if (status === 'exited') {
isExited = true;
}
if (status === 'restarting') {
isRestarting = true;
}
payload.push({
name: containerObj.Names,
status: {
isRunning,
isExited,
isRestarting
}
})
}
}
} else {
let isRunning = false;
let isExited = false;
let isRestarting = false;
const status = await checkContainer({ dockerId: application.destinationDocker.id, container: id });
if (status?.found) {
isRunning = status.status.isRunning;
isExited = status.status.isExited;
isRestarting = status.status.isRestarting
payload.push({
name: id,
status: {
isRunning,
isExited,
isRestarting
}
})
}
}
}
return {
isRunning,
isExited,
};
return payload
} catch ({ status, message }) {
return errorHandler({ status, message })
}
@@ -157,7 +240,8 @@ export async function getApplicationFromDB(id: string, teamId: string) {
gitSource: { include: { githubApp: true, gitlabApp: true } },
secrets: true,
persistentStorage: true,
connectedDatabase: true
connectedDatabase: true,
previewApplication: true
}
});
if (!application) {
@@ -245,13 +329,15 @@ export async function saveApplication(request: FastifyRequest<SaveApplication>,
baseImage,
baseBuildImage,
deploymentType,
baseDatabaseBranch
baseDatabaseBranch,
dockerComposeFile,
dockerComposeFileLocation,
dockerComposeConfiguration
} = request.body
if (port) port = Number(port);
if (exposePort) {
exposePort = Number(exposePort);
}
const { destinationDocker: { engine, remoteEngine, remoteIpAddress }, exposePort: configuredPort } = await prisma.application.findUnique({ where: { id }, include: { destinationDocker: true } })
if (exposePort) await checkExposedPort({ id, configuredPort, exposePort, engine, remoteEngine, remoteIpAddress })
if (denoOptions) denoOptions = denoOptions.trim();
@@ -280,6 +366,9 @@ export async function saveApplication(request: FastifyRequest<SaveApplication>,
baseImage,
baseBuildImage,
deploymentType,
dockerComposeFile,
dockerComposeFileLocation,
dockerComposeConfiguration,
...defaultConfiguration,
connectedDatabase: { update: { hostedDatabaseDBName: baseDatabaseBranch } }
}
@@ -298,6 +387,9 @@ export async function saveApplication(request: FastifyRequest<SaveApplication>,
baseImage,
baseBuildImage,
deploymentType,
dockerComposeFile,
dockerComposeFileLocation,
dockerComposeConfiguration,
...defaultConfiguration
}
});
@@ -313,15 +405,10 @@ export async function saveApplication(request: FastifyRequest<SaveApplication>,
export async function saveApplicationSettings(request: FastifyRequest<SaveApplicationSettings>, reply: FastifyReply) {
try {
const { id } = request.params
const { debug, previews, dualCerts, autodeploy, branch, projectId, isBot, isDBBranching } = request.body
// const isDouble = await checkDoubleBranch(branch, projectId);
// if (isDouble && autodeploy) {
// await prisma.applicationSettings.updateMany({ where: { application: { branch, projectId } }, data: { autodeploy: false } })
// throw { status: 500, message: 'Cannot activate automatic deployments until only one application is defined for this repository / branch.' }
// }
const { debug, previews, dualCerts, autodeploy, branch, projectId, isBot, isDBBranching, isCustomSSL } = request.body
await prisma.application.update({
where: { id },
data: { fqdn: isBot ? null : undefined, settings: { update: { debug, previews, dualCerts, autodeploy, isBot, isDBBranching } } },
data: { fqdn: isBot ? null : undefined, settings: { update: { debug, previews, dualCerts, autodeploy, isBot, isDBBranching, isCustomSSL } } },
include: { destinationDocker: true }
});
return reply.code(201).send();
@@ -339,10 +426,11 @@ export async function stopPreviewApplication(request: FastifyRequest<StopPreview
if (application?.destinationDockerId) {
const container = `${id}-${pullmergeRequestId}`
const { id: dockerId } = application.destinationDocker;
const found = await checkContainer({ dockerId, container });
const { found } = await checkContainer({ dockerId, container });
if (found) {
await removeContainer({ id: container, dockerId: application.destinationDocker.id });
}
await prisma.previewApplication.deleteMany({ where: { applicationId: application.id, pullmergeRequestId } })
}
return reply.code(201).send();
} catch ({ status, message }) {
@@ -366,7 +454,10 @@ export async function restartApplication(request: FastifyRequest<OnlyId>, reply:
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
if (secret.isPRMRSecret) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
envs.push(`${secret.name}=${isSecretFound[0].value}`);
} else {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
@@ -463,7 +554,22 @@ export async function stopApplication(request: FastifyRequest<OnlyId>, reply: Fa
const application: any = await getApplicationFromDB(id, teamId);
if (application?.destinationDockerId) {
const { id: dockerId } = application.destinationDocker;
const found = await checkContainer({ dockerId, container: id });
if (application.buildPack === 'compose') {
const { stdout: containers } = await executeDockerCmd({
dockerId: application.destinationDocker.id,
command:
`docker ps -a --filter "label=coolify.applicationId=${id}" --format '{{json .}}'`
});
const containersArray = containers.trim().split('\n');
if (containersArray.length > 0 && containersArray[0] !== '') {
for (const container of containersArray) {
const containerObj = JSON.parse(container);
await removeContainer({ id: containerObj.ID, dockerId: application.destinationDocker.id });
}
}
return
}
const { found } = await checkContainer({ dockerId, container: id });
if (found) {
await removeContainer({ id, dockerId: application.destinationDocker.id });
}
@@ -570,6 +676,24 @@ export async function getUsage(request) {
return errorHandler({ status, message })
}
}
export async function getUsageByContainer(request) {
try {
const { id, containerId } = request.params
const teamId = request.user?.teamId;
let usage = {};
const application: any = await getApplicationFromDB(id, teamId);
if (application.destinationDockerId) {
[usage] = await Promise.all([getContainerUsage(application.destinationDocker.id, containerId)]);
}
return {
usage
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function deployApplication(request: FastifyRequest<DeployApplication>) {
try {
const { id } = request.params
@@ -607,7 +731,7 @@ export async function deployApplication(request: FastifyRequest<DeployApplicatio
githubAppId: application.gitSource?.githubApp?.id,
gitlabAppId: application.gitSource?.gitlabApp?.id,
status: 'queued',
type: 'manual'
type: pullmergeRequestId ? application.gitSource?.githubApp?.id ? 'manual_pr' : 'manual_mr' : 'manual'
}
});
return {
@@ -646,6 +770,7 @@ export async function saveApplicationSource(request: FastifyRequest<SaveApplicat
export async function getGitHubToken(request: FastifyRequest<OnlyId>, reply: FastifyReply) {
try {
const { default: got } = await import('got')
const { id } = request.params
const { teamId } = request.user
const application: any = await getApplicationFromDB(id, teamId);
@@ -657,13 +782,13 @@ export async function getGitHubToken(request: FastifyRequest<OnlyId>, reply: Fas
const githubToken = jsonwebtoken.sign(payload, application.gitSource.githubApp.privateKey, {
algorithm: 'RS256'
});
const { data } = await axios.post(`${application.gitSource.apiUrl}/app/installations/${application.gitSource.githubApp.installationId}/access_tokens`, {}, {
const { token } = await got.post(`${application.gitSource.apiUrl}/app/installations/${application.gitSource.githubApp.installationId}/access_tokens`, {
headers: {
Authorization: `Bearer ${githubToken}`
'Authorization': `Bearer ${githubToken}`,
}
})
}).json()
return reply.code(201).send({
token: data.token
token
})
} catch ({ status, message }) {
return errorHandler({ status, message })
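For context, the payload signed here follows GitHub's App JWT convention; its construction sits above this hunk, so the sketch below is an assumption of the typical shape, not the handler's exact code:

```ts
// Assumed shape of the GitHub App JWT payload signed in getGitHubToken.
const now = Math.floor(Date.now() / 1000);
const payload = {
  iat: now - 60,  // backdated slightly to tolerate clock drift
  exp: now + 600, // GitHub caps App JWTs at 10 minutes
  iss: 12345,     // placeholder GitHub App id
};
// The payload is signed with the App's RSA private key (RS256, as in the handler) and exchanged for an
// installation access token via POST {apiUrl}/app/installations/{installationId}/access_tokens.
console.log(payload);
```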
@@ -694,7 +819,7 @@ export async function saveRepository(request, reply) {
let { repository, branch, projectId, autodeploy, webhookToken, isPublicRepository = false } = request.body
repository = repository.toLowerCase();
branch = branch.toLowerCase();
projectId = Number(projectId);
if (webhookToken) {
await prisma.application.update({
@@ -755,7 +880,10 @@ export async function saveBuildPack(request, reply) {
try {
const { id } = request.params
const { buildPack } = request.body
await prisma.application.update({ where: { id }, data: { buildPack } });
const { baseImage, baseBuildImage } = setDefaultBaseImage(
buildPack
);
await prisma.application.update({ where: { id }, data: { buildPack, baseImage, baseBuildImage } });
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -775,55 +903,83 @@ export async function saveConnectedDatabase(request, reply) {
export async function getSecrets(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
let secrets = await prisma.secret.findMany({
where: { applicationId: id },
orderBy: { createdAt: 'desc' }
where: { applicationId: id, isPRMRSecret: false },
orderBy: { createdAt: 'asc' }
});
let previewSecrets = await prisma.secret.findMany({
where: { applicationId: id, isPRMRSecret: true },
orderBy: { createdAt: 'asc' }
});
secrets = secrets.map((secret) => {
secret.value = decrypt(secret.value);
return secret;
});
secrets = secrets.filter((secret) => !secret.isPRMRSecret).sort((a, b) => {
return ('' + a.name).localeCompare(b.name);
})
previewSecrets = previewSecrets.map((secret) => {
secret.value = decrypt(secret.value);
return secret;
});
return {
secrets
previewSecrets: previewSecrets.sort((a, b) => {
return ('' + a.name).localeCompare(b.name);
}),
secrets: secrets.sort((a, b) => {
return ('' + a.name).localeCompare(b.name);
})
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function updatePreviewSecret(request: FastifyRequest<SaveSecret>, reply: FastifyReply) {
try {
const { id } = request.params
let { name, value } = request.body
if (value) {
value = encrypt(value.trim())
} else {
value = ''
}
await prisma.secret.updateMany({
where: { applicationId: id, name, isPRMRSecret: true },
data: { value }
});
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function updateSecret(request: FastifyRequest<SaveSecret>, reply: FastifyReply) {
try {
const { id } = request.params
const { name, value, isBuildSecret = undefined } = request.body
await prisma.secret.updateMany({
where: { applicationId: id, name },
data: { value: encrypt(value.trim()), isBuildSecret }
});
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function saveSecret(request: FastifyRequest<SaveSecret>, reply: FastifyReply) {
try {
const { id } = request.params
let { name, value, isBuildSecret, isPRMRSecret, isNew } = request.body
if (isNew) {
const found = await prisma.secret.findFirst({ where: { name, applicationId: id, isPRMRSecret } });
if (found) {
throw { status: 500, message: `Secret ${name} already exists.` }
} else {
value = encrypt(value.trim());
await prisma.secret.create({
data: { name, value, isBuildSecret, isPRMRSecret, application: { connect: { id } } }
});
}
} else {
value = encrypt(value.trim());
const found = await prisma.secret.findFirst({ where: { applicationId: id, name, isPRMRSecret } });
if (found) {
await prisma.secret.updateMany({
where: { applicationId: id, name, isPRMRSecret },
data: { value, isBuildSecret, isPRMRSecret }
});
} else {
await prisma.secret.create({
data: { name, value, isBuildSecret, isPRMRSecret, application: { connect: { id } } }
});
}
const { name, value, isBuildSecret = false } = request.body
const found = await prisma.secret.findMany({ where: { applicationId: id, name } })
if (found.length > 0) {
throw ({ message: 'Secret already exists.' })
}
await prisma.secret.create({
data: { name, value: encrypt(value.trim()), isBuildSecret, isPRMRSecret: false, application: { connect: { id } } }
});
await prisma.secret.create({
data: { name, value: encrypt(value.trim()), isBuildSecret, isPRMRSecret: true, application: { connect: { id } } }
});
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -884,6 +1040,181 @@ export async function deleteStorage(request: FastifyRequest<DeleteStorage>) {
}
}
export async function restartPreview(request: FastifyRequest<RestartPreviewApplication>, reply: FastifyReply) {
try {
const { id, pullmergeRequestId } = request.params
const { teamId } = request.user
let application: any = await getApplicationFromDB(id, teamId);
if (application?.destinationDockerId) {
const buildId = cuid();
const { id: dockerId, network } = application.destinationDocker;
const { secrets, port, repository, persistentStorage, id: applicationId, buildPack, exposePort } = application;
const envs = [
`PORT=${port}`
];
if (secrets.length > 0) {
secrets.forEach((secret) => {
if (pullmergeRequestId) {
const isSecretFound = secrets.filter(s => s.name === secret.name && s.isPRMRSecret)
if (isSecretFound.length > 0) {
envs.push(`${secret.name}=${isSecretFound[0].value}`);
} else {
envs.push(`${secret.name}=${secret.value}`);
}
} else {
if (!secret.isPRMRSecret) {
envs.push(`${secret.name}=${secret.value}`);
}
}
});
}
const { workdir } = await createDirectories({ repository, buildId });
const labels = []
let image = null
const { stdout: container } = await executeDockerCmd({ dockerId, command: `docker container ls --filter 'label=com.docker.compose.service=${id}-${pullmergeRequestId}' --format '{{json .}}'` })
const containersArray = container.trim().split('\n');
for (const container of containersArray) {
const containerObj = formatLabelsOnDocker(container);
image = containerObj[0].Image
Object.keys(containerObj[0].Labels).forEach(function (key) {
if (key.startsWith('coolify')) {
labels.push(`${key}=${containerObj[0].Labels[key]}`)
}
})
}
let imageFound = false;
try {
await executeDockerCmd({
dockerId,
command: `docker image inspect ${image}`
})
imageFound = true;
} catch (error) {
//
}
if (!imageFound) {
throw { status: 500, message: 'Image not found, cannot restart application.' }
}
await fs.writeFile(`${workdir}/.env`, envs.join('\n'));
let envFound = false;
try {
envFound = !!(await fs.stat(`${workdir}/.env`));
} catch (error) {
//
}
const volumes =
persistentStorage?.map((storage) => {
return `${applicationId}${storage.path.replace(/\//gi, '-')}:${buildPack !== 'docker' ? '/app' : ''
}${storage.path}`;
}) || [];
const composeVolumes = volumes.map((volume) => {
return {
[`${volume.split(':')[0]}`]: {
name: volume.split(':')[0]
}
};
});
const composeFile = {
version: '3.8',
services: {
[`${applicationId}-${pullmergeRequestId}`]: {
image,
container_name: `${applicationId}-${pullmergeRequestId}`,
volumes,
env_file: envFound ? [`${workdir}/.env`] : [],
labels,
depends_on: [],
expose: [port],
...(exposePort ? { ports: [`${exposePort}:${port}`] } : {}),
...defaultComposeConfiguration(network),
}
},
networks: {
[network]: {
external: true
}
},
volumes: Object.assign({}, ...composeVolumes)
};
await fs.writeFile(`${workdir}/docker-compose.yml`, yaml.dump(composeFile));
await executeDockerCmd({ dockerId, command: `docker stop -t 0 ${id}-${pullmergeRequestId}` })
await executeDockerCmd({ dockerId, command: `docker rm ${id}-${pullmergeRequestId}` })
await executeDockerCmd({ dockerId, command: `docker compose --project-directory ${workdir} up -d` })
return reply.code(201).send();
}
throw { status: 500, message: 'Application cannot be restarted.' }
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function getPreviewStatus(request: FastifyRequest<RestartPreviewApplication>) {
try {
const { id, pullmergeRequestId } = request.params
const { teamId } = request.user
let isRunning = false;
let isExited = false;
let isRestarting = false;
let isBuilding = false
const application: any = await getApplicationFromDB(id, teamId);
if (application?.destinationDockerId) {
const status = await checkContainer({ dockerId: application.destinationDocker.id, container: `${id}-${pullmergeRequestId}` });
if (status?.found) {
isRunning = status.status.isRunning;
isExited = status.status.isExited;
isRestarting = status.status.isRestarting
}
const building = await prisma.build.findMany({ where: { applicationId: id, pullmergeRequestId, status: { in: ['queued', 'running'] } } })
isBuilding = building.length > 0
}
return {
isBuilding,
isRunning,
isRestarting,
isExited,
};
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function loadPreviews(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
const application = await prisma.application.findUnique({ where: { id }, include: { destinationDocker: true } });
const { stdout } = await executeDockerCmd({ dockerId: application.destinationDocker.id, command: `docker container ls --filter 'name=${id}-' --format "{{json .}}"` })
if (stdout === '') {
throw { status: 500, message: 'No previews found.' }
}
const containers = formatLabelsOnDocker(stdout).filter(container => container.Labels['coolify.configuration'] && container.Labels['coolify.type'] === 'standalone-application')
const jsonContainers = containers
.map((container) =>
JSON.parse(Buffer.from(container.Labels['coolify.configuration'], 'base64').toString())
)
.filter((container) => {
return container.pullmergeRequestId && container.applicationId === id;
});
for (const container of jsonContainers) {
const found = await prisma.previewApplication.findMany({ where: { applicationId: container.applicationId, pullmergeRequestId: container.pullmergeRequestId } })
if (found.length === 0) {
await prisma.previewApplication.create({
data: {
pullmergeRequestId: container.pullmergeRequestId,
sourceBranch: container.branch,
customDomain: container.fqdn,
application: { connect: { id: container.applicationId } }
}
})
}
}
return {
previews: await prisma.previewApplication.findMany({ where: { applicationId: id } })
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
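loadPreviews rebuilds preview records from the coolify.configuration container label, which stores base64-encoded JSON. A minimal round-trip sketch with example values:

```ts
// Example values only; the handler reads applicationId, pullmergeRequestId, branch and fqdn from this label.
const configuration = {
  applicationId: 'clxyz123',
  pullmergeRequestId: '42',
  branch: 'feature/login',
  fqdn: 'https://pr-42.example.com',
};

// Writing the label when the preview container is created ...
const label = Buffer.from(JSON.stringify(configuration)).toString('base64');

// ... and decoding it again, as loadPreviews does:
const decoded = JSON.parse(Buffer.from(label, 'base64').toString());
console.log(decoded.pullmergeRequestId, decoded.branch);
```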
export async function getPreviews(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
@@ -899,26 +1230,7 @@ export async function getPreviews(request: FastifyRequest<OnlyId>) {
const applicationSecrets = secrets.filter((secret) => !secret.isPRMRSecret);
const PRMRSecrets = secrets.filter((secret) => secret.isPRMRSecret);
const application = await prisma.application.findUnique({ where: { id }, include: { destinationDocker: true } });
const { stdout } = await executeDockerCmd({ dockerId: application.destinationDocker.id, command: `docker container ls --filter 'name=${id}-' --format "{{json .}}"` })
if (stdout === '') {
return {
containers: [],
applicationSecrets: [],
PRMRSecrets: []
}
}
const containers = formatLabelsOnDocker(stdout).filter(container => container.Labels['coolify.configuration'] && container.Labels['coolify.type'] === 'standalone-application')
const jsonContainers = containers
.map((container) =>
JSON.parse(Buffer.from(container.Labels['coolify.configuration'], 'base64').toString())
)
.filter((container) => {
return container.pullmergeRequestId && container.applicationId === id;
});
return {
containers: jsonContainers,
applicationSecrets: applicationSecrets.sort((a, b) => {
return ('' + a.name).localeCompare(b.name);
}),
@@ -933,7 +1245,7 @@ export async function getPreviews(request: FastifyRequest<OnlyId>) {
export async function getApplicationLogs(request: FastifyRequest<GetApplicationLogs>) {
try {
const { id } = request.params;
const { id, containerId } = request.params;
let { since = 0 } = request.query
if (since !== 0) {
since = day(since).unix();
@@ -944,10 +1256,8 @@ export async function getApplicationLogs(request: FastifyRequest<GetApplicationL
});
if (destinationDockerId) {
try {
// const found = await checkContainer({ dockerId, container: id })
// if (found) {
const { default: ansi } = await import('strip-ansi')
const { stdout, stderr } = await executeDockerCmd({ dockerId, command: `docker logs --since ${since} --tail 5000 --timestamps ${id}` })
const { stdout, stderr } = await executeDockerCmd({ dockerId, command: `docker logs --since ${since} --tail 5000 --timestamps ${containerId}` })
const stripLogsStdout = stdout.toString().split('\n').map((l) => ansi(l)).filter((a) => a);
const stripLogsStderr = stderr.toString().split('\n').map((l) => ansi(l)).filter((a) => a);
const logs = stripLogsStderr.concat(stripLogsStdout)
@@ -955,7 +1265,10 @@ export async function getApplicationLogs(request: FastifyRequest<GetApplicationL
return { logs: sortedLogs }
// }
} catch (error) {
const { statusCode } = error;
const { statusCode, stderr } = error;
if (stderr.startsWith('Error: No such container')) {
return { logs: [], noContainer: true }
}
if (statusCode === 404) {
return {
logs: []
@@ -970,7 +1283,7 @@ export async function getApplicationLogs(request: FastifyRequest<GetApplicationL
return errorHandler({ status, message })
}
}
export async function getBuildLogs(request: FastifyRequest<GetBuildLogs>) {
export async function getBuilds(request: FastifyRequest<GetBuilds>) {
try {
const { id } = request.params
let { buildId, skip = 0 } = request.query
@@ -987,17 +1300,15 @@ export async function getBuildLogs(request: FastifyRequest<GetBuildLogs>) {
builds = await prisma.build.findMany({
where: { applicationId: id },
orderBy: { createdAt: 'desc' },
take: 5,
skip
take: 5 + skip
});
}
builds = builds.map((build) => {
const updatedAt = day(build.updatedAt).utc();
build.took = updatedAt.diff(day(build.createdAt)) / 1000;
build.since = updatedAt.fromNow();
return build;
});
if (build.status === 'running') {
build.elapsed = (day().utc().diff(day(build.createdAt)) / 1000).toFixed(0);
}
return build
})
return {
builds,
buildCount
@@ -1009,22 +1320,49 @@ export async function getBuildLogs(request: FastifyRequest<GetBuildLogs>) {
export async function getBuildIdLogs(request: FastifyRequest<GetBuildIdLogs>) {
try {
const { buildId } = request.params
// TODO: Fluentbit could still hold the logs, so we need to check if the logs are done
const { buildId, id } = request.params
let { sequence = 0 } = request.query
if (typeof sequence !== 'number') {
sequence = Number(sequence)
}
let logs = await prisma.buildLog.findMany({
where: { buildId, time: { gt: sequence } },
orderBy: { time: 'asc' }
});
let file = `/app/logs/${id}_buildlog_${buildId}.csv`
if (isDev) {
file = `${process.cwd()}/../../logs/${id}_buildlog_${buildId}.csv`
}
const data = await prisma.build.findFirst({ where: { id: buildId } });
const createdAt = day(data.createdAt).utc();
try {
await fs.stat(file)
} catch (error) {
let logs = await prisma.buildLog.findMany({
where: { buildId, time: { gt: sequence } },
orderBy: { time: 'asc' }
});
const data = await prisma.build.findFirst({ where: { id: buildId } });
const createdAt = day(data.createdAt).utc();
return {
logs: logs.map(log => {
log.time = Number(log.time)
return log
}),
fromDb: true,
took: day().diff(createdAt) / 1000,
status: data?.status || 'queued'
}
}
let fileLogs = (await fs.readFile(file)).toString()
let decryptedLogs = await csv({ noheader: true }).fromString(fileLogs)
let logs = decryptedLogs.map(log => {
const parsed = {
time: log['field1'],
line: decrypt(log['field2'] + '","' + log['field3'])
}
return parsed
}).filter(log => log.time > sequence)
return {
logs: logs.map(log => {
log.time = Number(log.time)
return log
}),
logs,
fromDb: false,
took: day().diff(createdAt) / 1000,
status: data?.status || 'queued'
}


@@ -1,8 +1,8 @@
import { FastifyPluginAsync } from 'fastify';
import { OnlyId } from '../../../../types';
import { cancelDeployment, checkDNS, checkDomain, checkRepository, deleteApplication, deleteSecret, deleteStorage, deployApplication, getApplication, getApplicationLogs, getApplicationStatus, getBuildIdLogs, getBuildLogs, getBuildPack, getGitHubToken, getGitLabSSHKey, getImages, getPreviews, getSecrets, getStorages, getUsage, listApplications, newApplication, restartApplication, saveApplication, saveApplicationSettings, saveApplicationSource, saveBuildPack, saveConnectedDatabase, saveDeployKey, saveDestination, saveGitLabSSHKey, saveRepository, saveSecret, saveStorage, stopApplication, stopPreviewApplication } from './handlers';
import { cancelDeployment, checkDNS, checkDomain, checkRepository, cleanupUnconfiguredApplications, deleteApplication, deleteSecret, deleteStorage, deployApplication, getApplication, getApplicationLogs, getApplicationStatus, getBuildIdLogs, getBuildPack, getBuilds, getGitHubToken, getGitLabSSHKey, getImages, getPreviews, getPreviewStatus, getSecrets, getStorages, getUsage, getUsageByContainer, listApplications, loadPreviews, newApplication, restartApplication, restartPreview, saveApplication, saveApplicationSettings, saveApplicationSource, saveBuildPack, saveConnectedDatabase, saveDeployKey, saveDestination, saveGitLabSSHKey, saveRepository, saveSecret, saveStorage, stopApplication, stopPreviewApplication, updatePreviewSecret, updateSecret } from './handlers';
import type { CancelDeployment, CheckDNS, CheckDomain, CheckRepository, DeleteApplication, DeleteSecret, DeleteStorage, DeployApplication, GetApplicationLogs, GetBuildIdLogs, GetBuildLogs, GetImages, SaveApplication, SaveApplicationSettings, SaveApplicationSource, SaveDeployKey, SaveDestination, SaveSecret, SaveStorage, StopPreviewApplication } from './types';
import type { CancelDeployment, CheckDNS, CheckDomain, CheckRepository, DeleteApplication, DeleteSecret, DeleteStorage, DeployApplication, GetApplicationLogs, GetBuildIdLogs, GetBuilds, GetImages, RestartPreviewApplication, SaveApplication, SaveApplicationSettings, SaveApplicationSource, SaveDeployKey, SaveDestination, SaveSecret, SaveStorage, StopPreviewApplication } from './types';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.addHook('onRequest', async (request) => {
@@ -11,6 +11,8 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/', async (request) => await listApplications(request));
fastify.post<GetImages>('/images', async (request) => await getImages(request));
fastify.post<any>('/cleanup/unconfigured', async (request) => await cleanupUnconfiguredApplications(request));
fastify.post('/new', async (request, reply) => await newApplication(request, reply));
fastify.get<OnlyId>('/:id', async (request) => await getApplication(request));
@@ -30,6 +32,8 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get<OnlyId>('/:id/secrets', async (request) => await getSecrets(request));
fastify.post<SaveSecret>('/:id/secrets', async (request, reply) => await saveSecret(request, reply));
fastify.put<SaveSecret>('/:id/secrets', async (request, reply) => await updateSecret(request, reply));
fastify.put<SaveSecret>('/:id/secrets/preview', async (request, reply) => await updatePreviewSecret(request, reply));
fastify.delete<DeleteSecret>('/:id/secrets', async (request) => await deleteSecret(request));
fastify.get<OnlyId>('/:id/storages', async (request) => await getStorages(request));
@@ -37,12 +41,17 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.delete<DeleteStorage>('/:id/storages', async (request) => await deleteStorage(request));
fastify.get<OnlyId>('/:id/previews', async (request) => await getPreviews(request));
fastify.post<OnlyId>('/:id/previews/load', async (request) => await loadPreviews(request));
fastify.get<RestartPreviewApplication>('/:id/previews/:pullmergeRequestId/status', async (request) => await getPreviewStatus(request));
fastify.post<RestartPreviewApplication>('/:id/previews/:pullmergeRequestId/restart', async (request, reply) => await restartPreview(request, reply));
fastify.get<GetApplicationLogs>('/:id/logs', async (request) => await getApplicationLogs(request));
fastify.get<GetBuildLogs>('/:id/logs/build', async (request) => await getBuildLogs(request));
// fastify.get<GetApplicationLogs>('/:id/logs', async (request) => await getApplicationLogs(request));
fastify.get<GetApplicationLogs>('/:id/logs/:containerId', async (request) => await getApplicationLogs(request));
fastify.get<GetBuilds>('/:id/logs/build', async (request) => await getBuilds(request));
fastify.get<GetBuildIdLogs>('/:id/logs/build/:buildId', async (request) => await getBuildIdLogs(request));
fastify.get('/:id/usage', async (request) => await getUsage(request))
fastify.get('/:id/usage/:containerId', async (request) => await getUsageByContainer(request))
fastify.post<DeployApplication>('/:id/deploy', async (request) => await deployApplication(request))
fastify.post<CancelDeployment>('/:id/cancel', async (request, reply) => await cancelDeployment(request, reply));
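The log route now takes a container id plus an optional since timestamp. A hedged client-side sketch of calling it; the base URL prefix, ids and auth header are placeholders, not the documented API surface:

```ts
// Placeholder base URL, ids and token; the route shape itself is /:id/logs/:containerId?since=<unix ts>.
async function fetchApplicationLogs(applicationId: string, containerId: string, since = 0) {
  const base = 'http://localhost:3001/api/v1/applications'; // assumed prefix, adjust to your deployment
  const token = process.env.COOLIFY_TOKEN ?? '';            // assumed auth mechanism
  const res = await fetch(`${base}/${applicationId}/logs/${containerId}?since=${since}`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  return res.json() as Promise<{ logs: string[]; noContainer?: boolean }>;
}
```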


@@ -21,12 +21,15 @@ export interface SaveApplication extends OnlyId {
baseImage: string,
baseBuildImage: string,
deploymentType: string,
baseDatabaseBranch: string
baseDatabaseBranch: string,
dockerComposeFile: string,
dockerComposeFileLocation: string,
dockerComposeConfiguration: string
}
}
export interface SaveApplicationSettings extends OnlyId {
Querystring: { domain: string; };
Body: { debug: boolean; previews: boolean; dualCerts: boolean; autodeploy: boolean; branch: string; projectId: number; isBot: boolean; isDBBranching: boolean };
Body: { debug: boolean; previews: boolean; dualCerts: boolean; autodeploy: boolean; branch: string; projectId: number; isBot: boolean; isDBBranching: boolean, isCustomSSL: boolean };
}
export interface DeleteApplication extends OnlyId {
Querystring: { domain: string; };
@@ -65,7 +68,7 @@ export interface SaveSecret extends OnlyId {
name: string,
value: string,
isBuildSecret: boolean,
isPRMRSecret: boolean,
previewSecret: boolean,
isNew: boolean
}
}
@@ -84,12 +87,16 @@ export interface DeleteStorage extends OnlyId {
path: string,
}
}
export interface GetApplicationLogs extends OnlyId {
export interface GetApplicationLogs {
Params: {
id: string,
containerId: string
}
Querystring: {
since: number,
}
}
export interface GetBuildLogs extends OnlyId {
export interface GetBuilds extends OnlyId {
Querystring: {
buildId: string
skip: number,
@@ -97,6 +104,7 @@ export interface GetBuildLogs extends OnlyId {
}
export interface GetBuildIdLogs {
Params: {
id: string,
buildId: string
},
Querystring: {
@@ -126,4 +134,10 @@ export interface StopPreviewApplication extends OnlyId {
Body: {
pullmergeRequestId: string | null,
}
}
export interface RestartPreviewApplication {
Params: {
id: string,
pullmergeRequestId: string | null,
}
}


@@ -2,12 +2,13 @@ import { FastifyPluginAsync } from 'fastify';
import { errorHandler, listSettings, version } from '../../../../lib/common';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/', async () => {
fastify.get('/', async (request) => {
const teamId = request.user?.teamId;
const settings = await listSettings()
try {
return {
ipv4: settings.ipv4,
ipv6: settings.ipv6,
ipv4: teamId ? settings.ipv4 : 'nope',
ipv6: teamId ? settings.ipv6 : 'nope',
version,
whiteLabeled: process.env.COOLIFY_WHITE_LABELED === 'true',
whiteLabeledIcon: process.env.COOLIFY_WHITE_LABELED_ICON,


@@ -51,6 +51,30 @@ export async function newDatabase(request: FastifyRequest, reply: FastifyReply)
return errorHandler({ status, message })
}
}
export async function cleanupUnconfiguredDatabases(request: FastifyRequest) {
try {
const teamId = request.user.teamId;
let databases = await prisma.database.findMany({
where: { teams: { some: { id: teamId === "0" ? undefined : teamId } } },
include: { settings: true, destinationDocker: true, teams: true },
});
for (const database of databases) {
if (!database?.version) {
const { id } = database;
if (database.destinationDockerId) {
const everStarted = await stopDatabaseContainer(database);
if (everStarted) await stopTcpHttpProxy(id, database.destinationDocker, database.publicPort);
}
await prisma.databaseSettings.deleteMany({ where: { databaseId: id } });
await prisma.databaseSecret.deleteMany({ where: { databaseId: id } });
await prisma.database.delete({ where: { id } });
}
}
return {}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function getDatabaseStatus(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params;


@@ -1,5 +1,5 @@
import { FastifyPluginAsync } from 'fastify';
import { deleteDatabase, deleteDatabaseSecret, getDatabase, getDatabaseLogs, getDatabaseSecrets, getDatabaseStatus, getDatabaseTypes, getDatabaseUsage, getVersions, listDatabases, newDatabase, saveDatabase, saveDatabaseDestination, saveDatabaseSecret, saveDatabaseSettings, saveDatabaseType, saveVersion, startDatabase, stopDatabase } from './handlers';
import { cleanupUnconfiguredDatabases, deleteDatabase, deleteDatabaseSecret, getDatabase, getDatabaseLogs, getDatabaseSecrets, getDatabaseStatus, getDatabaseTypes, getDatabaseUsage, getVersions, listDatabases, newDatabase, saveDatabase, saveDatabaseDestination, saveDatabaseSecret, saveDatabaseSettings, saveDatabaseType, saveVersion, startDatabase, stopDatabase } from './handlers';
import type { OnlyId } from '../../../../types';
@@ -12,6 +12,8 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/', async (request) => await listDatabases(request));
fastify.post('/new', async (request, reply) => await newDatabase(request, reply));
fastify.post<any>('/cleanup/unconfigured', async (request) => await cleanupUnconfiguredDatabases(request));
fastify.get<OnlyId>('/:id', async (request) => await getDatabase(request));
fastify.post<SaveDatabase>('/:id', async (request, reply) => await saveDatabase(request, reply));
fastify.delete<DeleteDatabase>('/:id', async (request) => await deleteDatabase(request));


@@ -4,7 +4,7 @@ import sshConfig from 'ssh-config'
import fs from 'fs/promises'
import os from 'os';
import { asyncExecShell, createRemoteEngineConfiguration, decrypt, errorHandler, executeDockerCmd, listSettings, prisma, startTraefikProxy, stopTraefikProxy } from '../../../../lib/common';
import { asyncExecShell, createRemoteEngineConfiguration, decrypt, errorHandler, executeDockerCmd, executeSSHCmd, listSettings, prisma, startTraefikProxy, stopTraefikProxy } from '../../../../lib/common';
import { checkContainer } from '../../../../lib/docker';
import type { OnlyId } from '../../../../types';
@@ -202,25 +202,58 @@ export async function assignSSHKey(request: FastifyRequest) {
return errorHandler({ status, message })
}
}
export async function verifyRemoteDockerEngine(request: FastifyRequest<OnlyId>, reply: FastifyReply) {
export async function verifyRemoteDockerEngineFn(id: string) {
await createRemoteEngineConfiguration(id);
const { remoteIpAddress, network, isCoolifyProxyUsed } = await prisma.destinationDocker.findFirst({ where: { id } })
const host = `ssh://${remoteIpAddress}-remote`
const { stdout } = await asyncExecShell(`DOCKER_HOST=${host} docker network ls --filter 'name=${network}' --no-trunc --format "{{json .}}"`);
if (!stdout) {
await asyncExecShell(`DOCKER_HOST=${host} docker network create --attachable ${network}`);
}
const { stdout: coolifyNetwork } = await asyncExecShell(`DOCKER_HOST=${host} docker network ls --filter 'name=coolify-infra' --no-trunc --format "{{json .}}"`);
if (!coolifyNetwork) {
await asyncExecShell(`DOCKER_HOST=${host} docker network create --attachable coolify-infra`);
}
if (isCoolifyProxyUsed) await startTraefikProxy(id);
try {
const { id } = request.params;
await createRemoteEngineConfiguration(id);
const { remoteIpAddress, remoteUser, network, isCoolifyProxyUsed } = await prisma.destinationDocker.findFirst({ where: { id } })
const host = `ssh://${remoteUser}@${remoteIpAddress}`
const { stdout } = await asyncExecShell(`DOCKER_HOST=${host} docker network ls --filter 'name=${network}' --no-trunc --format "{{json .}}"`);
if (!stdout) {
await asyncExecShell(`DOCKER_HOST=${host} docker network create --attachable ${network}`);
}
const { stdout: coolifyNetwork } = await asyncExecShell(`DOCKER_HOST=${host} docker network ls --filter 'name=coolify-infra' --no-trunc --format "{{json .}}"`);
if (!coolifyNetwork) {
await asyncExecShell(`DOCKER_HOST=${host} docker network create --attachable coolify-infra`);
}
if (isCoolifyProxyUsed) await startTraefikProxy(id);
await prisma.destinationDocker.update({ where: { id }, data: { remoteVerified: true } })
return reply.code(201).send()
const { stdout: daemonJson } = await executeSSHCmd({ dockerId: id, command: `cat /etc/docker/daemon.json` });
let daemonJsonParsed = JSON.parse(daemonJson);
let isUpdated = false;
if (!daemonJsonParsed['live-restore'] || daemonJsonParsed['live-restore'] !== true) {
isUpdated = true;
daemonJsonParsed['live-restore'] = true
}
if (!daemonJsonParsed?.features?.buildkit) {
isUpdated = true;
daemonJsonParsed.features = {
buildkit: true
}
}
if (isUpdated) {
await executeSSHCmd({ dockerId: id, command: `echo '${JSON.stringify(daemonJsonParsed)}' > /etc/docker/daemon.json` });
await executeSSHCmd({ dockerId: id, command: `systemctl restart docker` });
}
} catch (error) {
const daemonJsonParsed = {
"live-restore": true,
"features": {
"buildkit": true
}
}
await executeSSHCmd({ dockerId: id, command: `echo '${JSON.stringify(daemonJsonParsed)}' > /etc/docker/daemon.json` });
await executeSSHCmd({ dockerId: id, command: `systemctl restart docker` });
} finally {
await prisma.destinationDocker.update({ where: { id }, data: { remoteVerified: true } })
}
}
export async function verifyRemoteDockerEngine(request: FastifyRequest<OnlyId>, reply: FastifyReply) {
const { id } = request.params;
try {
await verifyRemoteDockerEngineFn(id);
return reply.code(201).send()
} catch ({ status, message }) {
await prisma.destinationDocker.update({ where: { id }, data: { remoteVerified: false } })
return errorHandler({ status, message })
}
}
@@ -229,7 +262,7 @@ export async function getDestinationStatus(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
const destination = await prisma.destinationDocker.findUnique({ where: { id } })
const isRunning = await checkContainer({ dockerId: destination.id, container: 'coolify-proxy', remove: true })
const { found: isRunning } = await checkContainer({ dockerId: destination.id, container: 'coolify-proxy', remove: true })
return {
isRunning
}


@@ -1,13 +1,23 @@
import axios from 'axios';
import { compareVersions } from 'compare-versions';
import cuid from 'cuid';
import bcrypt from 'bcryptjs';
import { asyncExecShell, asyncSleep, cleanupDockerStorage, errorHandler, isDev, listSettings, prisma, uniqueName, version } from '../../../lib/common';
import { supportedServiceTypesAndVersions } from '../../../lib/services/supportedVersions';
import type { FastifyReply, FastifyRequest } from 'fastify';
import type { Login, Update } from '.';
import type { GetCurrentUser } from './types';
import { compareVersions } from "compare-versions";
import cuid from "cuid";
import bcrypt from "bcryptjs";
import fs from 'fs/promises';
import yaml from 'js-yaml';
import {
asyncExecShell,
asyncSleep,
cleanupDockerStorage,
errorHandler,
isDev,
listSettings,
prisma,
uniqueName,
version,
} from "../../../lib/common";
import { scheduler } from "../../../lib/scheduler";
import type { FastifyReply, FastifyRequest } from "fastify";
import type { Login, Update } from ".";
import type { GetCurrentUser } from "./types";
export async function hashPassword(password: string): Promise<string> {
const saltRounds = 15;
@@ -17,34 +27,90 @@ export async function hashPassword(password: string): Promise<string> {
export async function cleanupManually(request: FastifyRequest) {
try {
const { serverId } = request.body;
const destination = await prisma.destinationDocker.findUnique({ where: { id: serverId } })
await cleanupDockerStorage(destination.id, true, true)
return {}
const destination = await prisma.destinationDocker.findUnique({
where: { id: serverId },
});
await cleanupDockerStorage(destination.id, true, true);
return {};
} catch ({ status, message }) {
return errorHandler({ status, message })
return errorHandler({ status, message });
}
}
export async function refreshTags() {
try {
const { default: got } = await import('got')
try {
if (isDev) {
const tags = await fs.readFile('./devTags.json', 'utf8')
await fs.writeFile('./tags.json', tags)
} else {
const tags = await got.get('https://get.coollabs.io/coolify/service-tags.json').text()
await fs.writeFile('/app/tags.json', tags)
}
} catch (error) {
console.log(error)
throw {
status: 500,
message: 'Could not fetch templates from get.coollabs.io'
};
}
return {};
} catch ({ status, message }) {
return errorHandler({ status, message });
}
}
export async function refreshTemplates() {
try {
const { default: got } = await import('got')
try {
if (isDev) {
const response = await fs.readFile('./devTemplates.yaml', 'utf8')
await fs.writeFile('./templates.json', JSON.stringify(yaml.load(response)))
} else {
const response = await got.get('https://get.coollabs.io/coolify/service-templates.yaml').text()
await fs.writeFile('/app/templates.json', JSON.stringify(yaml.load(response)))
}
} catch (error) {
console.log(error)
throw {
status: 500,
message: 'Could not fetch templates from get.coollabs.io'
};
}
return {};
} catch ({ status, message }) {
return errorHandler({ status, message });
}
}
export async function checkUpdate(request: FastifyRequest) {
try {
const isStaging = request.hostname === 'staging.coolify.io'
const { default: got } = await import('got')
const isStaging =
request.hostname === "staging.coolify.io" ||
request.hostname === "arm.coolify.io";
const currentVersion = version;
const { data: versions } = await axios.get(
`https://get.coollabs.io/versions.json?appId=${process.env['COOLIFY_APP_ID']}&version=${currentVersion}`
);
const latestVersion = versions['coolify'].main.version
const { coolify } = await got.get('https://get.coollabs.io/versions.json', {
searchParams: {
appId: process.env['COOLIFY_APP_ID'] || undefined,
version: currentVersion
}
}).json()
const latestVersion = coolify.main.version;
const isUpdateAvailable = compareVersions(latestVersion, currentVersion);
if (isStaging) {
return {
isUpdateAvailable: true,
latestVersion: 'next'
}
latestVersion: "next",
};
}
return {
isUpdateAvailable: isStaging ? true : isUpdateAvailable === 1,
latestVersion
latestVersion,
};
} catch ({ status, message }) {
return errorHandler({ status, message })
return errorHandler({ status, message });
}
}
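checkUpdate reads coolify.main.version from the response, so the versions endpoint is assumed to return roughly this shape (other fields may exist but are not consumed here):

```ts
// Assumed response shape of https://get.coollabs.io/versions.json, as consumed by checkUpdate above.
interface VersionsResponse {
  coolify: {
    main: {
      version: string; // e.g. "3.11.0"
    };
    // other release channels may exist but are not read here
  };
}
```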
@@ -59,7 +125,7 @@ export async function update(request: FastifyRequest<Update>) {
`sed -i '/COOLIFY_AUTO_UPDATE=/cCOOLIFY_AUTO_UPDATE=${isAutoUpdateEnabled}' .env`
);
await asyncExecShell(
`docker run --rm -tid --env-file .env -v /var/run/docker.sock:/var/run/docker.sock -v coolify-db coollabsio/coolify:${latestVersion} /bin/sh -c "env | grep COOLIFY > .env && echo 'TAG=${latestVersion}' >> .env && docker stop -t 0 coolify && docker rm coolify && docker compose up -d --force-recreate"`
`docker run --rm -tid --env-file .env -v /var/run/docker.sock:/var/run/docker.sock -v coolify-db coollabsio/coolify:${latestVersion} /bin/sh -c "env | grep COOLIFY > .env && echo 'TAG=${latestVersion}' >> .env && docker stop -t 0 coolify coolify-fluentbit && docker rm coolify coolify-fluentbit && docker compose pull && docker compose up -d --force-recreate"`
);
return {};
} else {
@@ -67,13 +133,27 @@ export async function update(request: FastifyRequest<Update>) {
return {};
}
} catch ({ status, message }) {
return errorHandler({ status, message })
return errorHandler({ status, message });
}
}
export async function resetQueue(request: FastifyRequest<any>) {
try {
const teamId = request.user.teamId;
if (teamId === "0") {
await prisma.build.updateMany({
where: { status: { in: ["queued", "running"] } },
data: { status: "canceled" },
});
scheduler.workers.get("deployApplication").postMessage("cancel");
}
} catch ({ status, message }) {
return errorHandler({ status, message });
}
}
export async function restartCoolify(request: FastifyRequest<any>) {
try {
const teamId = request.user.teamId;
if (teamId === '0') {
if (teamId === "0") {
if (!isDev) {
asyncExecShell(`docker restart coolify`);
return {};
@@ -81,9 +161,12 @@ export async function restartCoolify(request: FastifyRequest<any>) {
return {};
}
}
throw { status: 500, message: 'You are not authorized to restart Coolify.' };
throw {
status: 500,
message: "You are not authorized to restart Coolify.",
};
} catch ({ status, message }) {
return errorHandler({ status, message })
return errorHandler({ status, message });
}
}
@@ -91,28 +174,50 @@ export async function showDashboard(request: FastifyRequest) {
try {
const userId = request.user.userId;
const teamId = request.user.teamId;
const applications = await prisma.application.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { settings: true, destinationDocker: true, teams: true }
let applications = await prisma.application.findMany({
where: { teams: { some: { id: teamId === "0" ? undefined : teamId } } },
include: { settings: true, destinationDocker: true, teams: true },
});
const databases = await prisma.database.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { settings: true, destinationDocker: true, teams: true }
where: { teams: { some: { id: teamId === "0" ? undefined : teamId } } },
include: { settings: true, destinationDocker: true, teams: true },
});
const services = await prisma.service.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { destinationDocker: true, teams: true }
where: { teams: { some: { id: teamId === "0" ? undefined : teamId } } },
include: { destinationDocker: true, teams: true },
});
const gitSources = await prisma.gitSource.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { teams: true }
where: { OR: [{ teams: { some: { id: teamId === "0" ? undefined : teamId } } }, { isSystemWide: true }] },
include: { teams: true },
});
const destinations = await prisma.destinationDocker.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
include: { teams: true }
where: { teams: { some: { id: teamId === "0" ? undefined : teamId } } },
include: { teams: true },
});
const settings = await listSettings();
let foundUnconfiguredApplication = false;
for (const application of applications) {
if (!application.buildPack || !application.destinationDockerId || !application.branch || (!application.settings?.isBot && !application?.fqdn) && application.buildPack !== "compose") {
foundUnconfiguredApplication = true
}
}
let foundUnconfiguredService = false;
for (const service of services) {
if (!service.fqdn) {
foundUnconfiguredService = true
}
}
let foundUnconfiguredDatabase = false;
for (const database of databases) {
if (!database.version) {
foundUnconfiguredDatabase = true
}
}
return {
foundUnconfiguredApplication,
foundUnconfiguredDatabase,
foundUnconfiguredService,
applications,
databases,
services,
@@ -121,88 +226,98 @@ export async function showDashboard(request: FastifyRequest) {
settings,
};
} catch ({ status, message }) {
return errorHandler({ status, message })
return errorHandler({ status, message });
}
}
export async function login(request: FastifyRequest<Login>, reply: FastifyReply) {
export async function login(
request: FastifyRequest<Login>,
reply: FastifyReply
) {
if (request.user) {
return reply.redirect('/dashboard');
return reply.redirect("/dashboard");
} else {
const { email, password, isLogin } = request.body || {};
if (!email || !password) {
throw { status: 500, message: 'Email and password are required.' };
throw { status: 500, message: "Email and password are required." };
}
const users = await prisma.user.count();
const userFound = await prisma.user.findUnique({
where: { email },
include: { teams: true, permission: true },
rejectOnNotFound: false
rejectOnNotFound: false,
});
if (!userFound && isLogin) {
throw { status: 500, message: 'User not found.' };
throw { status: 500, message: "User not found." };
}
const { isRegistrationEnabled, id } = await prisma.setting.findFirst()
const { isRegistrationEnabled, id } = await prisma.setting.findFirst();
let uid = cuid();
let permission = 'read';
let permission = "read";
let isAdmin = false;
if (users === 0) {
await prisma.setting.update({ where: { id }, data: { isRegistrationEnabled: false } });
uid = '0';
await prisma.setting.update({
where: { id },
data: { isRegistrationEnabled: false },
});
uid = "0";
}
if (userFound) {
if (userFound.type === 'email') {
if (userFound.password === 'RESETME') {
if (userFound.type === "email") {
if (userFound.password === "RESETME") {
const hashedPassword = await hashPassword(password);
if (userFound.updatedAt < new Date(Date.now() - 1000 * 60 * 10)) {
if (userFound.id === '0') {
if (userFound.id === "0") {
await prisma.user.update({
where: { email: userFound.email },
data: { password: 'RESETME' }
data: { password: "RESETME" },
});
} else {
await prisma.user.update({
where: { email: userFound.email },
data: { password: 'RESETTIMEOUT' }
data: { password: "RESETTIMEOUT" },
});
}
throw {
status: 500,
message: 'Password reset link has expired. Please request a new one.'
message:
"Password reset link has expired. Please request a new one.",
};
} else {
await prisma.user.update({
where: { email: userFound.email },
data: { password: hashedPassword }
data: { password: hashedPassword },
});
return {
userId: userFound.id,
teamId: userFound.id,
permission: userFound.permission,
isAdmin: true
isAdmin: true,
};
}
}
const passwordMatch = await bcrypt.compare(password, userFound.password);
const passwordMatch = await bcrypt.compare(
password,
userFound.password
);
if (!passwordMatch) {
throw {
status: 500,
message: 'Wrong password or email address.'
message: "Wrong password or email address.",
};
}
uid = userFound.id;
isAdmin = true;
}
} else {
permission = 'owner';
permission = "owner";
isAdmin = true;
if (!isRegistrationEnabled) {
throw {
status: 404,
message: 'Registration disabled by administrator.'
message: "Registration disabled by administrator.",
};
}
const hashedPassword = await hashPassword(password);
@@ -212,17 +327,17 @@ export async function login(request: FastifyRequest<Login>, reply: FastifyReply)
id: uid,
email,
password: hashedPassword,
type: 'email',
type: "email",
teams: {
create: {
id: uid,
name: uniqueName(),
destinationDocker: { connect: { network: 'coolify' } }
}
destinationDocker: { connect: { network: "coolify" } },
},
},
permission: { create: { teamId: uid, permission: 'owner' } }
permission: { create: { teamId: uid, permission: "owner" } },
},
include: { teams: true }
include: { teams: true },
});
} else {
await prisma.user.create({
@@ -230,16 +345,16 @@ export async function login(request: FastifyRequest<Login>, reply: FastifyReply)
id: uid,
email,
password: hashedPassword,
type: 'email',
type: "email",
teams: {
create: {
id: uid,
name: uniqueName()
}
name: uniqueName(),
},
},
permission: { create: { teamId: uid, permission: 'owner' } }
permission: { create: { teamId: uid, permission: "owner" } },
},
include: { teams: true }
include: { teams: true },
});
}
}
@@ -247,18 +362,21 @@ export async function login(request: FastifyRequest<Login>, reply: FastifyReply)
userId: uid,
teamId: uid,
permission,
isAdmin
isAdmin,
};
}
}
export async function getCurrentUser(request: FastifyRequest<GetCurrentUser>, fastify) {
let token = null
const { teamId } = request.query
export async function getCurrentUser(
request: FastifyRequest<GetCurrentUser>,
fastify
) {
let token = null;
const { teamId } = request.query;
try {
const user = await prisma.user.findUnique({
where: { id: request.user.userId }
})
where: { id: request.user.userId },
});
if (!user) {
throw "User not found";
}
@@ -269,28 +387,29 @@ export async function getCurrentUser(request: FastifyRequest<GetCurrentUser>, fa
try {
const user = await prisma.user.findFirst({
where: { id: request.user.userId, teams: { some: { id: teamId } } },
include: { teams: true, permission: true }
})
include: { teams: true, permission: true },
});
if (user) {
const permission = user.permission.find(p => p.teamId === teamId).permission
const permission = user.permission.find(
(p) => p.teamId === teamId
).permission;
const payload = {
...request.user,
teamId,
permission: permission || null,
isAdmin: permission === 'owner' || permission === 'admin'
}
token = fastify.jwt.sign(payload)
isAdmin: permission === "owner" || permission === "admin",
};
token = fastify.jwt.sign(payload);
}
} catch (error) {
// No new token -> not switching teams
}
}
const pendingInvitations = await prisma.teamInvitation.findMany({ where: { uid: request.user.userId } })
return {
settings: await prisma.setting.findFirst(),
supportedServiceTypesAndVersions,
pendingInvitations,
token,
...request.user
}
...request.user,
};
}

View File

@@ -5,9 +5,10 @@ import { decrypt, errorHandler, prisma, uniqueName } from '../../../../lib/commo
import { day } from '../../../../lib/dayjs';
import type { OnlyId } from '../../../../types';
import type { BodyId, InviteToTeam, SaveTeam, SetPermission } from './types';
import type { BodyId, DeleteUserFromTeam, InviteToTeam, SaveTeam, SetPermission } from './types';
export async function listTeams(request: FastifyRequest) {
export async function listAccounts(request: FastifyRequest) {
try {
const userId = request.user.userId;
const teamId = request.user.teamId;
@@ -15,10 +16,24 @@ export async function listTeams(request: FastifyRequest) {
where: { id: userId },
select: { id: true, email: true, teams: true }
});
let accounts = [];
let allTeams = [];
let accounts = await prisma.user.findMany({ where: { teams: { some: { id: teamId } } }, select: { id: true, email: true, teams: true } });
if (teamId === '0') {
accounts = await prisma.user.findMany({ select: { id: true, email: true, teams: true } });
}
return {
account,
accounts
};
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function listTeams(request: FastifyRequest) {
try {
const userId = request.user.userId;
const teamId = request.user.teamId;
let allTeams = [];
if (teamId === '0') {
allTeams = await prisma.team.findMany({
where: { users: { none: { id: userId } } },
include: { permissions: true }
@@ -28,18 +43,30 @@ export async function listTeams(request: FastifyRequest) {
where: { users: { some: { id: userId } } },
include: { permissions: true }
});
const invitations = await prisma.teamInvitation.findMany({ where: { uid: userId } });
return {
ownTeams,
allTeams,
invitations,
account,
accounts
};
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
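// Removes a user from a team: the requester must belong to the team; the user is disconnected and their permission rows for that team are deleted.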
export async function removeUserFromTeam(request: FastifyRequest<DeleteUserFromTeam>, reply: FastifyReply) {
try {
const { uid } = request.body;
const { id } = request.params;
const userId = request.user.userId;
const foundUser = await prisma.team.findMany({ where: { id, users: { some: { id: userId } } } });
if (foundUser.length === 0) {
return errorHandler({ status: 404, message: 'Team not found' });
}
await prisma.team.update({ where: { id }, data: { users: { disconnect: { id: uid } } } });
await prisma.permission.deleteMany({ where: { teamId: id, userId: uid } })
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function deleteTeam(request: FastifyRequest<OnlyId>, reply: FastifyReply) {
try {
const userId = request.user.userId;

View File

@@ -1,19 +1,22 @@
import { FastifyPluginAsync } from 'fastify';
import { acceptInvitation, changePassword, deleteTeam, getTeam, inviteToTeam, listTeams, newTeam, removeUser, revokeInvitation, saveTeam, setPermission } from './handlers';
import { acceptInvitation, changePassword, deleteTeam, getTeam, inviteToTeam, listAccounts, listTeams, newTeam, removeUser, removeUserFromTeam, revokeInvitation, saveTeam, setPermission } from './handlers';
import type { OnlyId } from '../../../../types';
import type { BodyId, InviteToTeam, SaveTeam, SetPermission } from './types';
import type { BodyId, DeleteUserFromTeam, InviteToTeam, SaveTeam, SetPermission } from './types';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.addHook('onRequest', async (request) => {
return await request.jwtVerify()
})
fastify.get('/', async (request) => await listTeams(request));
fastify.get('/', async (request) => await listAccounts(request));
fastify.post('/new', async (request, reply) => await newTeam(request, reply));
fastify.get('/teams', async (request) => await listTeams(request));
fastify.get<OnlyId>('/team/:id', async (request, reply) => await getTeam(request, reply));
fastify.post<SaveTeam>('/team/:id', async (request, reply) => await saveTeam(request, reply));
fastify.delete<OnlyId>('/team/:id', async (request, reply) => await deleteTeam(request, reply));
fastify.post<DeleteUserFromTeam>('/team/:id/user/remove', async (request, reply) => await removeUserFromTeam(request, reply));
fastify.post<InviteToTeam>('/team/:id/invitation/invite', async (request, reply) => await inviteToTeam(request, reply))
fastify.post<BodyId>('/team/:id/invitation/accept', async (request) => await acceptInvitation(request));
@@ -23,7 +26,6 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.delete<BodyId>('/user/remove', async (request, reply) => await removeUser(request, reply));
fastify.post<BodyId>('/user/password', async (request, reply) => await changePassword(request, reply));
// fastify.delete('/user', async (request, reply) => await deleteUser(request, reply));
};

View File

@@ -5,6 +5,14 @@ export interface SaveTeam extends OnlyId {
name: string
}
}
export interface DeleteUserFromTeam {
Body: {
uid: string
},
Params: {
id: string
}
}
export interface InviteToTeam {
Body: {
email: string,

View File

@@ -1,5 +1,5 @@
import { FastifyPluginAsync } from 'fastify';
import { checkUpdate, login, showDashboard, update, showUsage, getCurrentUser, cleanupManually, restartCoolify } from './handlers';
import { checkUpdate, login, showDashboard, update, resetQueue, getCurrentUser, cleanupManually, restartCoolify, refreshTemplates } from './handlers';
import { GetCurrentUser } from './types';
export interface Update {
@@ -23,9 +23,7 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
onRequest: [fastify.authenticate]
}, async (request) => await getCurrentUser(request, fastify));
fastify.get('/undead', {
onRequest: [fastify.authenticate]
}, async function () {
fastify.get('/undead', async function () {
return { message: 'nope' };
});
@@ -47,9 +45,17 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
onRequest: [fastify.authenticate]
}, async (request) => await restartCoolify(request));
fastify.post('/internal/resetQueue', {
onRequest: [fastify.authenticate]
}, async (request) => await resetQueue(request));
fastify.post('/internal/cleanup', {
onRequest: [fastify.authenticate]
}, async (request) => await cleanupManually(request));
fastify.post('/internal/refreshTemplates', {
onRequest: [fastify.authenticate]
}, async () => await refreshTemplates());
};
export default root;

View File

@@ -8,9 +8,16 @@ export async function listServers(request: FastifyRequest) {
try {
const userId = request.user.userId;
const teamId = request.user.teamId;
const servers = await prisma.destinationDocker.findMany({ where: { teams: { some: { id: teamId === '0' ? undefined : teamId } }, remoteEngine: false }, distinct: ['engine'] })
// const remoteServers = await prisma.destinationDocker.findMany({ where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } }, distinct: ['remoteIpAddress', 'engine'] })
let servers = await prisma.destinationDocker.findMany({ where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } }, distinct: ['remoteIpAddress', 'engine'] })
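// Keep local Docker engines and only those remote engines that have been verified.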
servers = servers.filter((server) => {
if (server.remoteEngine) {
if (server.remoteVerified) {
return server
}
} else {
return server
}
})
return {
servers
}
@@ -67,8 +74,7 @@ export async function showUsage(request: FastifyRequest) {
const { stdout: stats } = await executeSSHCmd({ dockerId: id, command: `vmstat -s` })
const { stdout: disks } = await executeSSHCmd({ dockerId: id, command: `df -m / --output=size,used,pcent|grep -v 'Used'| xargs` })
const { stdout: cpus } = await executeSSHCmd({ dockerId: id, command: `nproc --all` })
// const { stdout: cpuUsage } = await executeSSHCmd({ dockerId: id, command: `echo $[100-$(vmstat 1 2|tail -1|awk '{print $15}')]` })
// console.log(cpuUsage)
const { stdout: cpuUsage } = await executeSSHCmd({ dockerId: id, command: `echo $[100-$(vmstat 1 2|tail -1|awk '{print $15}')]` })
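// CPU usage is 100 minus the idle percentage reported by the second 'vmstat 1 2' sample.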
const parsed: any = parseFromText(stats)
return {
usage: {
@@ -81,8 +87,8 @@ export async function showUsage(request: FastifyRequest) {
freeMemPercentage: (parsed.totalMemoryKB - parsed.usedMemoryKB) / parsed.totalMemoryKB * 100
},
cpu: {
load: 0,
usage: 0,
load: [0, 0, 0],
usage: cpuUsage,
count: cpus
},
disk: {

View File

@@ -1,15 +1,17 @@
import type { FastifyReply, FastifyRequest } from 'fastify';
import fs from 'fs/promises';
import yaml from 'js-yaml';
import { prisma, uniqueName, asyncExecShell, getServiceFromDB, getContainerUsage, isDomainConfigured, saveUpdateableFields, fixType, decrypt, encrypt, ComposeFile, getFreePublicPort, getDomain, errorHandler, generatePassword, isDev, stopTcpHttpProxy, executeDockerCmd, checkDomainsIsValidInDNS, checkExposedPort, listSettings } from '../../../../lib/common';
import { day } from '../../../../lib/dayjs';
import { checkContainer, isContainerExited } from '../../../../lib/docker';
import bcrypt from 'bcryptjs';
import cuid from 'cuid';
import type { OnlyId } from '../../../../types';
import { prisma, uniqueName, asyncExecShell, getServiceFromDB, getContainerUsage, isDomainConfigured, fixType, decrypt, encrypt, ComposeFile, getFreePublicPort, getDomain, errorHandler, generatePassword, isDev, stopTcpHttpProxy, executeDockerCmd, checkDomainsIsValidInDNS, checkExposedPort, listSettings } from '../../../../lib/common';
import { day } from '../../../../lib/dayjs';
import { checkContainer, } from '../../../../lib/docker';
import { removeService } from '../../../../lib/services/common';
import { getTags, getTemplates } from '../../../../lib/services';
import type { ActivateWordpressFtp, CheckService, CheckServiceDomain, DeleteServiceSecret, DeleteServiceStorage, GetServiceLogs, SaveService, SaveServiceDestination, SaveServiceSecret, SaveServiceSettings, SaveServiceStorage, SaveServiceType, SaveServiceVersion, ServiceStartStop, SetGlitchTipSettings, SetWordpressSettings } from './types';
import { supportedServiceTypesAndVersions } from '../../../../lib/services/supportedVersions';
import { configureServiceType, removeService } from '../../../../lib/services/common';
import type { OnlyId } from '../../../../types';
export async function listServices(request: FastifyRequest) {
try {
@@ -36,30 +38,217 @@ export async function newService(request: FastifyRequest, reply: FastifyReply) {
return errorHandler({ status, message })
}
}
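// Removes services that were never fully configured (no FQDN): stops and deletes their compose containers, then deletes the service records.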
export async function cleanupUnconfiguredServices(request: FastifyRequest) {
try {
const teamId = request.user.teamId;
let services = await prisma.service.findMany({
where: { teams: { some: { id: teamId === "0" ? undefined : teamId } } },
include: { destinationDocker: true, teams: true },
});
for (const service of services) {
if (!service.fqdn) {
if (service.destinationDockerId) {
await executeDockerCmd({
dockerId: service.destinationDockerId,
command: `docker ps -a --filter 'label=com.docker.compose.project=${service.id}' --format {{.ID}}|xargs -r -n 1 docker stop -t 0`
})
await executeDockerCmd({
dockerId: service.destinationDockerId,
command: `docker ps -a --filter 'label=com.docker.compose.project=${service.id}' --format {{.ID}}|xargs -r -n 1 docker rm --force`
})
}
await removeService({ id: service.id });
}
}
return {}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
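// Returns a per-container status map (isRunning/isExited/isRestarting/isExcluded) built from 'docker ps' output for the service's compose project; containers marked 'exclude' in the template are only flagged as excluded.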
export async function getServiceStatus(request: FastifyRequest<OnlyId>) {
try {
const teamId = request.user.teamId;
const { id } = request.params;
let isRunning = false;
let isExited = false
const service = await getServiceFromDB({ id, teamId });
const { destinationDockerId, settings } = service;
let payload = {}
if (destinationDockerId) {
isRunning = await checkContainer({ dockerId: service.destinationDocker.id, container: id });
isExited = await isContainerExited(service.destinationDocker.id, id);
}
return {
isRunning,
isExited,
settings
const { stdout: containers } = await executeDockerCmd({
dockerId: service.destinationDocker.id,
command:
`docker ps -a --filter "label=com.docker.compose.project=${id}" --format '{{json .}}'`
});
const containersArray = containers.trim().split('\n');
if (containersArray.length > 0 && containersArray[0] !== '') {
const templates = await getTemplates();
let template = templates.find(t => t.type === service.type);
template = JSON.parse(JSON.stringify(template).replaceAll('$$id', service.id));
for (const container of containersArray) {
let isRunning = false;
let isExited = false;
let isRestarting = false;
let isExcluded = false;
const containerObj = JSON.parse(container);
const exclude = template.services[containerObj.Names]?.exclude;
if (exclude) {
payload[containerObj.Names] = {
status: {
isExcluded: true,
isRunning: false,
isExited: false,
isRestarting: false,
}
}
continue;
}
const status = containerObj.State
if (status === 'running') {
isRunning = true;
}
if (status === 'exited') {
isExited = true;
}
if (status === 'restarting') {
isRestarting = true;
}
payload[containerObj.Names] = {
status: {
isExcluded,
isRunning,
isExited,
isRestarting
}
}
}
}
}
return payload
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
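// Resolves the template for a service. Outside of deployments it builds a UI-friendly view (editable environment variables, FQDN settings, documentation links); for deployments it returns the full template. Finally it substitutes $$id, $$core_version, $$workdir, $$config_* and $$secret_* placeholders.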
export async function parseAndFindServiceTemplates(service: any, workdir?: string, isDeploy: boolean = false) {
const templates = await getTemplates()
const foundTemplate = templates.find(t => fixType(t.type) === service.type)
let parsedTemplate = {}
if (foundTemplate) {
if (!isDeploy) {
for (const [key, value] of Object.entries(foundTemplate.services)) {
const realKey = key.replace('$$id', service.id)
let name = value.name
if (!name) {
if (Object.keys(foundTemplate.services).length === 1) {
name = foundTemplate.name || service.name.toLowerCase()
} else {
if (key === '$$id') {
name = foundTemplate.name || key.replaceAll('$$id-', '') || service.name.toLowerCase()
} else {
name = key.replaceAll('$$id-', '') || service.name.toLowerCase()
}
}
}
parsedTemplate[realKey] = {
name,
documentation: value.documentation || foundTemplate.documentation || 'https://docs.coollabs.io',
image: value.image,
environment: [],
fqdns: [],
proxy: {}
}
if (value.environment?.length > 0) {
for (const env of value.environment) {
let [envKey, ...envValue] = env.split('=')
envValue = envValue.join("=")
const variable = foundTemplate.variables.find(v => v.name === envKey) || foundTemplate.variables.find(v => v.id === envValue)
if (variable) {
const id = variable.id.replaceAll('$$', '')
const label = variable?.label
const description = variable?.description
const defaultValue = variable?.defaultValue
const main = variable?.main || '$$id'
const type = variable?.type || 'input'
const placeholder = variable?.placeholder || ''
const readOnly = variable?.readOnly || false
const required = variable?.required || false
if (envValue.startsWith('$$config') || variable?.showOnConfiguration) {
if (envValue.startsWith('$$config_coolify')) {
continue
}
parsedTemplate[realKey].environment.push(
{ id, name: envKey, value: envValue, main, label, description, defaultValue, type, placeholder, required, readOnly }
)
}
}
}
}
if (value?.proxy && value.proxy.length > 0) {
for (const proxyValue of value.proxy) {
if (proxyValue.domain) {
const variable = foundTemplate.variables.find(v => v.id === proxyValue.domain)
if (variable) {
const { id, name, label, description, defaultValue, required = false } = variable
const found = await prisma.serviceSetting.findFirst({ where: { serviceId: service.id, variableName: proxyValue.domain } })
parsedTemplate[realKey].fqdns.push(
{ id, name, value: found?.value || '', label, description, defaultValue, required }
)
}
}
}
}
}
} else {
parsedTemplate = foundTemplate
}
let strParsedTemplate = JSON.stringify(parsedTemplate)
// replace $$id and $$workdir
strParsedTemplate = strParsedTemplate.replaceAll('$$id', service.id)
strParsedTemplate = strParsedTemplate.replaceAll('$$core_version', service.version || foundTemplate.defaultVersion)
// replace $$fqdn
if (workdir) {
strParsedTemplate = strParsedTemplate.replaceAll('$$workdir', workdir)
}
// replace $$config
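// Each placeholder is matched together with the closing JSON quote, so every replacement re-appends '"' to keep the serialized template valid.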
if (service.serviceSetting.length > 0) {
for (const setting of service.serviceSetting) {
const { value, variableName } = setting
const regex = new RegExp(`\\$\\$config_${variableName.replace('$$config_', '')}\\"`, 'gi')
if (value === '$$generate_fqdn') {
strParsedTemplate = strParsedTemplate.replaceAll(regex, (service.fqdn || '') + "\"")
} else if (value === '$$generate_domain') {
strParsedTemplate = strParsedTemplate.replaceAll(regex, getDomain(service.fqdn) + "\"")
} else if (service.destinationDocker?.network && value === '$$generate_network') {
strParsedTemplate = strParsedTemplate.replaceAll(regex, service.destinationDocker.network + "\"")
} else {
strParsedTemplate = strParsedTemplate.replaceAll(regex, value + "\"")
}
}
}
// replace $$secret
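// $$hashed$$secret_<name> placeholders get a bcrypt hash of the secret; plain $$secret_<name> placeholders get the raw, quote-escaped value.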
if (service.serviceSecret.length > 0) {
for (const secret of service.serviceSecret) {
const { name, value } = secret
const regexHashed = new RegExp(`\\$\\$hashed\\$\\$secret_${name}\\"`, 'gi')
const regex = new RegExp(`\\$\\$secret_${name}\\"`, 'gi')
if (value) {
strParsedTemplate = strParsedTemplate.replaceAll(regexHashed, bcrypt.hashSync(value.replaceAll("\"", "\\\""), 10) + "\"")
strParsedTemplate = strParsedTemplate.replaceAll(regex, value.replaceAll("\"", "\\\"") + "\"")
} else {
strParsedTemplate = strParsedTemplate.replaceAll(regexHashed, "\"")
strParsedTemplate = strParsedTemplate.replaceAll(regex, "\"")
}
}
}
parsedTemplate = JSON.parse(strParsedTemplate)
}
return parsedTemplate
}
export async function getService(request: FastifyRequest<OnlyId>) {
try {
@@ -69,9 +258,17 @@ export async function getService(request: FastifyRequest<OnlyId>) {
if (!service) {
throw { status: 404, message: 'Service not found.' }
}
let template = {}
let tags = []
if (service.type) {
template = await parseAndFindServiceTemplates(service)
tags = await getTags(service.type)
}
return {
settings: await listSettings(),
service
service,
template,
tags
}
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -80,7 +277,7 @@ export async function getService(request: FastifyRequest<OnlyId>) {
export async function getServiceType(request: FastifyRequest) {
try {
return {
types: supportedServiceTypesAndVersions
services: await getTemplates()
}
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -90,25 +287,79 @@ export async function saveServiceType(request: FastifyRequest<SaveServiceType>,
try {
const { id } = request.params;
const { type } = request.body;
await configureServiceType({ id, type });
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function getServiceVersions(request: FastifyRequest<OnlyId>) {
try {
const teamId = request.user.teamId;
const { id } = request.params;
const { type } = await getServiceFromDB({ id, teamId });
return {
type,
versions: supportedServiceTypesAndVersions.find((name) => name.name === type).versions
const templates = await getTemplates()
let foundTemplate = templates.find(t => fixType(t.type) === fixType(type))
if (foundTemplate) {
foundTemplate = JSON.parse(JSON.stringify(foundTemplate).replaceAll('$$id', id))
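// Generate initial values for template variables: $$generate_password(N) / $$generate_hex(N) create random strings of length N (taken from the parenthesised suffix), $$generate_username creates a cuid; other variables fall back to their default value.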
if (foundTemplate.variables.length > 0) {
for (const variable of foundTemplate.variables) {
const { defaultValue } = variable;
const regex = /^\$\$.*\((\d+)\)$/g;
const length = Number(regex.exec(defaultValue)?.[1]) || undefined
if (variable.defaultValue.startsWith('$$generate_password')) {
variable.value = generatePassword({ length });
} else if (variable.defaultValue.startsWith('$$generate_hex')) {
variable.value = generatePassword({ length, isHex: true });
} else if (variable.defaultValue.startsWith('$$generate_username')) {
variable.value = cuid();
} else {
variable.value = variable.defaultValue || '';
}
const foundVariableSomewhereElse = foundTemplate.variables.find(v => v.defaultValue.includes(variable.id))
if (foundVariableSomewhereElse) {
foundVariableSomewhereElse.value = foundVariableSomewhereElse.value.replaceAll(variable.id, variable.value)
}
}
}
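// Persist the generated values: $$secret_* variables become encrypted service secrets, $$config_* variables become service settings (only created if they do not exist yet).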
for (const variable of foundTemplate.variables) {
if (variable.id.startsWith('$$secret_')) {
const found = await prisma.serviceSecret.findFirst({ where: { name: variable.name, serviceId: id } })
if (!found) {
await prisma.serviceSecret.create({
data: { name: variable.name, value: encrypt(variable.value) || '', service: { connect: { id } } }
})
}
}
if (variable.id.startsWith('$$config_')) {
const found = await prisma.serviceSetting.findFirst({ where: { name: variable.name, serviceId: id } })
if (!found) {
await prisma.serviceSetting.create({
data: { name: variable.name, value: variable.value.toString(), variableName: variable.id, service: { connect: { id } } }
})
}
}
}
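// Pre-create persistent storage entries for named volumes declared in the template; host-path mounts (starting with '/') are left alone.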
for (const service of Object.keys(foundTemplate.services)) {
if (foundTemplate.services[service].volumes) {
for (const volume of foundTemplate.services[service].volumes) {
const [volumeName, path] = volume.split(':')
if (!volumeName.startsWith('/')) {
const found = await prisma.servicePersistentStorage.findFirst({ where: { volumeName, serviceId: id } })
if (!found) {
await prisma.servicePersistentStorage.create({
data: { volumeName, path, containerId: service, predefined: true, service: { connect: { id } } }
});
}
}
}
}
}
await prisma.service.update({ where: { id }, data: { type, version: foundTemplate.defaultVersion, templateVersion: foundTemplate.templateVersion } })
if (type.startsWith('wordpress')) {
await prisma.service.update({ where: { id }, data: { wordpress: { create: {} } } })
}
return reply.code(201).send()
} else {
throw { status: 404, message: 'Service type not found.' }
}
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function saveServiceVersion(request: FastifyRequest<SaveServiceVersion>, reply: FastifyReply) {
try {
const { id } = request.params;
@@ -155,7 +406,7 @@ export async function getServiceUsage(request: FastifyRequest<OnlyId>) {
}
export async function getServiceLogs(request: FastifyRequest<GetServiceLogs>) {
try {
const { id } = request.params;
const { id, containerId } = request.params;
let { since = 0 } = request.query
if (since !== 0) {
since = day(since).unix();
@@ -166,10 +417,8 @@ export async function getServiceLogs(request: FastifyRequest<GetServiceLogs>) {
});
if (destinationDockerId) {
try {
// const found = await checkContainer({ dockerId, container: id })
// if (found) {
const { default: ansi } = await import('strip-ansi')
const { stdout, stderr } = await executeDockerCmd({ dockerId, command: `docker logs --since ${since} --tail 5000 --timestamps ${id}` })
const { stdout, stderr } = await executeDockerCmd({ dockerId, command: `docker logs --since ${since} --tail 5000 --timestamps ${containerId}` })
const stripLogsStdout = stdout.toString().split('\n').map((l) => ansi(l)).filter((a) => a);
const stripLogsStderr = stderr.toString().split('\n').map((l) => ansi(l)).filter((a) => a);
const logs = stripLogsStderr.concat(stripLogsStdout)
@@ -177,7 +426,10 @@ export async function getServiceLogs(request: FastifyRequest<GetServiceLogs>) {
return { logs: sortedLogs }
// }
} catch (error) {
const { statusCode } = error;
const { statusCode, stderr } = error;
if (stderr.startsWith('Error: No such container')) {
return { logs: [], noContainer: true }
}
if (statusCode === 404) {
return {
logs: []
@@ -227,26 +479,22 @@ export async function checkServiceDomain(request: FastifyRequest<CheckServiceDom
export async function checkService(request: FastifyRequest<CheckService>) {
try {
const { id } = request.params;
let { fqdn, exposePort, forceSave, otherFqdns, dualCerts } = request.body;
let { fqdn, exposePort, forceSave, dualCerts, otherFqdn = false } = request.body;
const domainsList = await prisma.serviceSetting.findMany({ where: { variableName: { startsWith: '$$config_coolify_fqdn' } } })
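// FQDNs configured through template settings ($$config_coolify_fqdn*) are checked below as well, so a domain cannot be reused across services.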
if (fqdn) fqdn = fqdn.toLowerCase();
if (otherFqdns && otherFqdns.length > 0) otherFqdns = otherFqdns.map((f) => f.toLowerCase());
if (exposePort) exposePort = Number(exposePort);
const { destinationDocker: { remoteIpAddress, remoteEngine, engine }, exposePort: configuredPort } = await prisma.service.findUnique({ where: { id }, include: { destinationDocker: true } })
const { isDNSCheckEnabled } = await prisma.setting.findFirst({});
let found = await isDomainConfigured({ id, fqdn, remoteIpAddress });
let found = await isDomainConfigured({ id, fqdn, remoteIpAddress, checkOwn: otherFqdn });
if (found) {
throw { status: 500, message: `Domain ${getDomain(fqdn).replace('www.', '')} is already in use!` }
}
if (otherFqdns && otherFqdns.length > 0) {
for (const ofqdn of otherFqdns) {
found = await isDomainConfigured({ id, fqdn: ofqdn, remoteIpAddress });
if (found) {
throw { status: 500, message: `Domain ${getDomain(ofqdn).replace('www.', '')} is already in use!` }
}
}
if (domainsList.find(d => getDomain(d.value) === getDomain(fqdn))) {
throw { status: 500, message: `Domain ${getDomain(fqdn).replace('www.', '')} is already in use!` }
}
if (exposePort) await checkExposedPort({ id, configuredPort, exposePort, engine, remoteEngine, remoteIpAddress })
if (isDNSCheckEnabled && !isDev && !forceSave) {
@@ -262,20 +510,33 @@ export async function checkService(request: FastifyRequest<CheckService>) {
export async function saveService(request: FastifyRequest<SaveService>, reply: FastifyReply) {
try {
const { id } = request.params;
let { name, fqdn, exposePort, type } = request.body;
let { name, fqdn, exposePort, type, serviceSetting, version } = request.body;
if (fqdn) fqdn = fqdn.toLowerCase();
if (exposePort) exposePort = Number(exposePort);
type = fixType(type)
const update = saveUpdateableFields(type, request.body[type])
const data = {
fqdn,
name,
exposePort,
version,
}
if (Object.keys(update).length > 0) {
data[type] = { update: update }
const templates = await getTemplates()
const service = await prisma.service.findUnique({ where: { id } })
const foundTemplate = templates.find(t => fixType(t.type) === fixType(service.type))
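// Update changed settings in place; for new settings, resolve the template variable id by name when the client did not provide one.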
for (const setting of serviceSetting) {
let { id: settingId, name, value, changed = false, isNew = false, variableName } = setting
if (value) {
if (changed) {
await prisma.serviceSetting.update({ where: { id: settingId }, data: { value } })
}
if (isNew) {
if (!variableName) {
variableName = foundTemplate.variables.find(v => v.name === name).id
}
await prisma.serviceSetting.create({ data: { name, value, variableName, service: { connect: { id } } } })
}
}
}
await prisma.service.update({
where: { id }, data
@@ -289,11 +550,19 @@ export async function saveService(request: FastifyRequest<SaveService>, reply: F
export async function getServiceSecrets(request: FastifyRequest<OnlyId>) {
try {
const { id } = request.params
const teamId = request.user.teamId;
const service = await getServiceFromDB({ id, teamId });
let secrets = await prisma.serviceSecret.findMany({
where: { serviceId: id },
orderBy: { createdAt: 'desc' }
});
const templates = await getTemplates()
const foundTemplate = templates.find(t => fixType(t.type) === service.type)
secrets = secrets.map((secret) => {
const foundVariable = foundTemplate?.variables.find(v => v.name === secret.name) || null
if (foundVariable) {
secret.readOnly = foundVariable.readOnly
}
secret.value = decrypt(secret.value);
return secret;
});
@@ -310,7 +579,6 @@ export async function saveServiceSecret(request: FastifyRequest<SaveServiceSecre
try {
const { id } = request.params
let { name, value, isNew } = request.body
if (isNew) {
const found = await prisma.serviceSecret.findFirst({ where: { name, serviceId: id } });
if (found) {
@@ -369,16 +637,21 @@ export async function getServiceStorages(request: FastifyRequest<OnlyId>) {
export async function saveServiceStorage(request: FastifyRequest<SaveServiceStorage>, reply: FastifyReply) {
try {
const { id } = request.params
const { path, newStorage, storageId } = request.body
const { path, isNewStorage, storageId, containerId } = request.body
if (newStorage) {
if (isNewStorage) {
const volumeName = `${id}-custom${path.replace(/\//gi, '-')}`
const found = await prisma.servicePersistentStorage.findFirst({ where: { path, containerId } });
if (found) {
throw { status: 500, message: 'Persistent storage already exists for this container and path.' }
}
await prisma.servicePersistentStorage.create({
data: { path, service: { connect: { id } } }
data: { path, volumeName, containerId, service: { connect: { id } } }
});
} else {
await prisma.servicePersistentStorage.update({
where: { id: storageId },
data: { path }
data: { path, containerId }
});
}
return reply.code(201).send()
@@ -389,9 +662,8 @@ export async function saveServiceStorage(request: FastifyRequest<SaveServiceStor
export async function deleteServiceStorage(request: FastifyRequest<DeleteServiceStorage>) {
try {
const { id } = request.params
const { path } = request.body
await prisma.servicePersistentStorage.deleteMany({ where: { serviceId: id, path } });
const { storageId } = request.body
await prisma.servicePersistentStorage.deleteMany({ where: { id: storageId } });
return {}
} catch ({ status, message }) {
return errorHandler({ status, message })
@@ -447,14 +719,17 @@ export async function activatePlausibleUsers(request: FastifyRequest<OnlyId>, re
const {
destinationDockerId,
destinationDocker,
plausibleAnalytics: { postgresqlUser, postgresqlPassword, postgresqlDatabase }
serviceSecret
} = await getServiceFromDB({ id, teamId });
if (destinationDockerId) {
await executeDockerCmd({
dockerId: destinationDocker.id,
command: `docker exec ${id} 'psql -H postgresql://${postgresqlUser}:${postgresqlPassword}@localhost:5432/${postgresqlDatabase} -c "UPDATE users SET email_verified = true;"'`
})
return await reply.code(201).send()
const databaseUrl = serviceSecret.find((secret) => secret.name === 'DATABASE_URL');
if (databaseUrl) {
await executeDockerCmd({
dockerId: destinationDocker.id,
command: `docker exec ${id}-postgresql psql -H ${databaseUrl.value} -c "UPDATE users SET email_verified = true;"`
})
return await reply.code(201).send()
}
}
throw { status: 500, message: 'Could not activate users.' }
} catch ({ status, message }) {
@@ -472,7 +747,7 @@ export async function cleanupPlausibleLogs(request: FastifyRequest<OnlyId>, repl
if (destinationDockerId) {
await executeDockerCmd({
dockerId: destinationDocker.id,
command: `docker exec ${id}-clickhouse sh -c "/usr/bin/clickhouse-client -q \\"SELECT name FROM system.tables WHERE name LIKE '%log%';\\"| xargs -I{} /usr/bin/clickhouse-client -q \"TRUNCATE TABLE system.{};\""`
command: `docker exec ${id}-clickhouse /usr/bin/clickhouse-client -q \\"SELECT name FROM system.tables WHERE name LIKE '%log%';\\"| xargs -I{} /usr/bin/clickhouse-client -q \"TRUNCATE TABLE system.{};\"`
})
return await reply.code(201).send()
}
@@ -554,7 +829,7 @@ export async function activateWordpressFtp(request: FastifyRequest<ActivateWordp
});
try {
const isRunning = await checkContainer({ dockerId: destinationDocker.id, container: `${id}-ftp` });
const { found: isRunning } = await checkContainer({ dockerId: destinationDocker.id, container: `${id}-ftp` });
if (isRunning) {
await executeDockerCmd({
dockerId: destinationDocker.id,

View File

@@ -5,6 +5,7 @@ import {
checkService,
checkServiceDomain,
cleanupPlausibleLogs,
cleanupUnconfiguredServices,
deleteService,
deleteServiceSecret,
deleteServiceStorage,
@@ -15,7 +16,6 @@ import {
getServiceStorages,
getServiceType,
getServiceUsage,
getServiceVersions,
listServices,
newService,
saveService,
@@ -30,7 +30,7 @@ import {
import type { OnlyId } from '../../../../types';
import type { ActivateWordpressFtp, CheckService, CheckServiceDomain, DeleteServiceSecret, DeleteServiceStorage, GetServiceLogs, SaveService, SaveServiceDestination, SaveServiceSecret, SaveServiceSettings, SaveServiceStorage, SaveServiceType, SaveServiceVersion, ServiceStartStop, SetGlitchTipSettings, SetWordpressSettings } from './types';
import { startService, stopService } from '../../../../lib/services/handlers';
import { migrateAppwriteDB, startService, stopService } from '../../../../lib/services/handlers';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.addHook('onRequest', async (request) => {
@@ -39,6 +39,8 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/', async (request) => await listServices(request));
fastify.post('/new', async (request, reply) => await newService(request, reply));
fastify.post<any>('/cleanup/unconfigured', async (request) => await cleanupUnconfiguredServices(request));
fastify.get<OnlyId>('/:id', async (request) => await getService(request));
fastify.post<SaveService>('/:id', async (request, reply) => await saveService(request, reply));
fastify.delete<OnlyId>('/:id', async (request) => await deleteService(request));
@@ -61,21 +63,22 @@ const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/:id/configuration/type', async (request) => await getServiceType(request));
fastify.post<SaveServiceType>('/:id/configuration/type', async (request, reply) => await saveServiceType(request, reply));
fastify.get<OnlyId>('/:id/configuration/version', async (request) => await getServiceVersions(request));
fastify.post<SaveServiceVersion>('/:id/configuration/version', async (request, reply) => await saveServiceVersion(request, reply));
fastify.post<SaveServiceDestination>('/:id/configuration/destination', async (request, reply) => await saveServiceDestination(request, reply));
fastify.get<OnlyId>('/:id/usage', async (request) => await getServiceUsage(request));
fastify.get<GetServiceLogs>('/:id/logs', async (request) => await getServiceLogs(request));
fastify.get<GetServiceLogs>('/:id/logs/:containerId', async (request) => await getServiceLogs(request));
fastify.post<ServiceStartStop>('/:id/:type/start', async (request) => await startService(request));
fastify.post<ServiceStartStop>('/:id/:type/stop', async (request) => await stopService(request));
fastify.post<ServiceStartStop>('/:id/start', async (request) => await startService(request, fastify));
fastify.post<ServiceStartStop>('/:id/stop', async (request) => await stopService(request));
fastify.post<ServiceStartStop & SetWordpressSettings & SetGlitchTipSettings>('/:id/:type/settings', async (request, reply) => await setSettingsService(request, reply));
fastify.post<OnlyId>('/:id/plausibleanalytics/activate', async (request, reply) => await activatePlausibleUsers(request, reply));
fastify.post<OnlyId>('/:id/plausibleanalytics/cleanup', async (request, reply) => await cleanupPlausibleLogs(request, reply));
fastify.post<ActivateWordpressFtp>('/:id/wordpress/ftp', async (request, reply) => await activateWordpressFtp(request, reply));
fastify.post<OnlyId>('/:id/appwrite/migrate', async (request, reply) => await migrateAppwriteDB(request, reply));
};
export default root;

View File

@@ -15,9 +15,13 @@ export interface SaveServiceDestination extends OnlyId {
destinationId: string
}
}
export interface GetServiceLogs extends OnlyId {
export interface GetServiceLogs {
Params: {
id: string,
containerId: string
},
Querystring: {
since: number
since: number,
}
}
export interface SaveServiceSettings extends OnlyId {
@@ -36,7 +40,7 @@ export interface CheckService extends OnlyId {
forceSave: boolean,
dualCerts: boolean,
exposePort: number,
otherFqdns: Array<string>
otherFqdn: boolean
}
}
export interface SaveService extends OnlyId {
@@ -44,6 +48,8 @@ export interface SaveService extends OnlyId {
name: string,
fqdn: string,
exposePort: number,
version: string,
serviceSetting: any,
type: string
}
}
@@ -62,14 +68,15 @@ export interface DeleteServiceSecret extends OnlyId {
export interface SaveServiceStorage extends OnlyId {
Body: {
path: string,
newStorage: string,
containerId: string,
storageId: string,
isNewStorage: boolean,
}
}
export interface DeleteServiceStorage extends OnlyId {
Body: {
path: string,
storageId: string,
}
}
export interface ServiceStartStop {

View File

@@ -1,8 +1,9 @@
import { promises as dns } from 'dns';
import { X509Certificate } from 'node:crypto';
import type { FastifyReply, FastifyRequest } from 'fastify';
import { checkDomainsIsValidInDNS, decrypt, encrypt, errorHandler, getDomain, isDNSValid, isDomainConfigured, listSettings, prisma } from '../../../../lib/common';
import { CheckDNS, CheckDomain, DeleteDomain, DeleteSSHKey, SaveSettings, SaveSSHKey } from './types';
import { asyncExecShell, checkDomainsIsValidInDNS, decrypt, encrypt, errorHandler, isDev, isDNSValid, isDomainConfigured, listSettings, prisma } from '../../../../lib/common';
import { CheckDNS, CheckDomain, DeleteDomain, OnlyIdInBody, SaveSettings, SaveSSHKey } from './types';
export async function listAllSettings(request: FastifyRequest) {
@@ -16,8 +17,16 @@ export async function listAllSettings(request: FastifyRequest) {
unencryptedKeys.push({ id: key.id, name: key.name, privateKey: decrypt(key.privateKey), createdAt: key.createdAt })
}
}
const certificates = await prisma.certificate.findMany({ where: { team: { id: teamId } } })
let cns = [];
for (const certificate of certificates) {
const x509 = new X509Certificate(certificate.cert);
cns.push({ commonName: x509.subject.split('\n').find((s) => s.startsWith('CN=')).replace('CN=', ''), id: certificate.id, createdAt: certificate.createdAt })
}
return {
settings,
certificates: cns,
sshKeys: unencryptedKeys
}
} catch ({ status, message }) {
@@ -35,16 +44,18 @@ export async function saveSettings(request: FastifyRequest<SaveSettings>, reply:
maxPort,
isAutoUpdateEnabled,
isDNSCheckEnabled,
DNSServers
DNSServers,
proxyDefaultRedirect
} = request.body
const { id } = await listSettings();
await prisma.setting.update({
where: { id },
data: { isRegistrationEnabled, dualCerts, isAutoUpdateEnabled, isDNSCheckEnabled, DNSServers, isAPIDebuggingEnabled }
data: { isRegistrationEnabled, dualCerts, isAutoUpdateEnabled, isDNSCheckEnabled, DNSServers, isAPIDebuggingEnabled, }
});
if (fqdn) {
await prisma.setting.update({ where: { id }, data: { fqdn } });
}
await prisma.setting.update({ where: { id }, data: { proxyDefaultRedirect } });
if (minPort && maxPort) {
await prisma.setting.update({ where: { id }, data: { minPort, maxPort } });
}
@@ -82,7 +93,7 @@ export async function checkDomain(request: FastifyRequest<CheckDomain>) {
if (found) {
throw "Domain already configured";
}
if (isDNSCheckEnabled && !forceSave) {
if (isDNSCheckEnabled && !forceSave && !isDev) {
const hostname = request.hostname.split(':')[0]
return await checkDomainsIsValidInDNS({ hostname, fqdn, dualCerts });
}
@@ -118,7 +129,7 @@ export async function saveSSHKey(request: FastifyRequest<SaveSSHKey>, reply: Fas
return errorHandler({ status, message })
}
}
export async function deleteSSHKey(request: FastifyRequest<DeleteSSHKey>, reply: FastifyReply) {
export async function deleteSSHKey(request: FastifyRequest<OnlyIdInBody>, reply: FastifyReply) {
try {
const { id } = request.body;
await prisma.sshKey.delete({ where: { id } })
@@ -126,4 +137,15 @@ export async function deleteSSHKey(request: FastifyRequest<DeleteSSHKey>, reply:
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}
export async function deleteCertificates(request: FastifyRequest<OnlyIdInBody>, reply: FastifyReply) {
try {
const { id } = request.body;
await asyncExecShell(`docker exec coolify-proxy sh -c 'rm -f /etc/traefik/acme/custom/${id}-key.pem /etc/traefik/acme/custom/${id}-cert.pem'`)
await prisma.certificate.delete({ where: { id } })
return reply.code(201).send()
} catch ({ status, message }) {
return errorHandler({ status, message })
}
}

View File

@@ -1,21 +1,59 @@
import { FastifyPluginAsync } from 'fastify';
import { checkDNS, checkDomain, deleteDomain, deleteSSHKey, listAllSettings, saveSettings, saveSSHKey } from './handlers';
import { CheckDNS, CheckDomain, DeleteDomain, DeleteSSHKey, SaveSettings, SaveSSHKey } from './types';
import { X509Certificate } from 'node:crypto';
import { encrypt, errorHandler, prisma } from '../../../../lib/common';
import { checkDNS, checkDomain, deleteCertificates, deleteDomain, deleteSSHKey, listAllSettings, saveSettings, saveSSHKey } from './handlers';
import { CheckDNS, CheckDomain, DeleteDomain, OnlyIdInBody, SaveSettings, SaveSSHKey } from './types';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.addHook('onRequest', async (request) => {
return await request.jwtVerify()
})
fastify.get('/', async (request) => await listAllSettings(request));
fastify.post<SaveSettings>('/', async (request, reply) => await saveSettings(request, reply));
fastify.delete<DeleteDomain>('/', async (request, reply) => await deleteDomain(request, reply));
fastify.addHook('onRequest', async (request) => {
return await request.jwtVerify()
})
fastify.get('/', async (request) => await listAllSettings(request));
fastify.post<SaveSettings>('/', async (request, reply) => await saveSettings(request, reply));
fastify.delete<DeleteDomain>('/', async (request, reply) => await deleteDomain(request, reply));
fastify.get<CheckDNS>('/check', async (request) => await checkDNS(request));
fastify.post<CheckDomain>('/check', async (request) => await checkDomain(request));
fastify.get<CheckDNS>('/check', async (request) => await checkDNS(request));
fastify.post<CheckDomain>('/check', async (request) => await checkDomain(request));
fastify.post<SaveSSHKey>('/sshKey', async (request, reply) => await saveSSHKey(request, reply));
fastify.delete<DeleteSSHKey>('/sshKey', async (request, reply) => await deleteSSHKey(request, reply));
fastify.post<SaveSSHKey>('/sshKey', async (request, reply) => await saveSSHKey(request, reply));
fastify.delete<OnlyIdInBody>('/sshKey', async (request, reply) => await deleteSSHKey(request, reply));
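// Custom certificate upload: reads the 'key' and 'cert' multipart parts, rejects duplicate common names, stores the key encrypted and flags matching https applications as using custom SSL.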
fastify.post('/upload', async (request) => {
try {
const teamId = request.user.teamId;
const certificates = await prisma.certificate.findMany({})
let cns = [];
for (const certificate of certificates) {
const x509 = new X509Certificate(certificate.cert);
cns.push(x509.subject.split('\n').find((s) => s.startsWith('CN=')).replace('CN=', ''))
}
const parts = await request.files()
let key = null
let cert = null
for await (const part of parts) {
const name = part.fieldname
if (name === 'key') key = (await part.toBuffer()).toString()
if (name === 'cert') cert = (await part.toBuffer()).toString()
}
const x509 = new X509Certificate(cert);
const cn = x509.subject.split('\n').find((s) => s.startsWith('CN=')).replace('CN=', '')
if (cns.includes(cn)) {
throw {
message: `A certificate with the common name ${cn} already exists.`
}
}
await prisma.certificate.create({ data: { cert, key: encrypt(key), team: { connect: { id: teamId } } } })
await prisma.applicationSettings.updateMany({ where: { application: { AND: [{ fqdn: { endsWith: cn } }, { fqdn: { startsWith: 'https' } }] } }, data: { isCustomSSL: true } })
return { message: 'Certificate uploaded' }
} catch ({ status, message }) {
return errorHandler({ status, message });
}
});
fastify.delete<OnlyIdInBody>('/certificate', async (request, reply) => await deleteCertificates(request, reply))
// fastify.get('/certificates', async (request) => await getCertificates(request))
};
export default root;

View File

@@ -10,7 +10,8 @@ export interface SaveSettings {
maxPort: number,
isAutoUpdateEnabled: boolean,
isDNSCheckEnabled: boolean,
DNSServers: string
DNSServers: string,
proxyDefaultRedirect: string
}
}
export interface DeleteDomain {
@@ -41,4 +42,9 @@ export interface DeleteSSHKey {
Body: {
id: string
}
}
export interface OnlyIdInBody {
Body: {
id: string
}
}

View File

@@ -9,7 +9,7 @@ export async function listSources(request: FastifyRequest) {
try {
const teamId = request.user?.teamId;
const sources = await prisma.gitSource.findMany({
where: { teams: { some: { id: teamId === '0' ? undefined : teamId } } },
where: { OR: [{ teams: { some: { id: teamId === "0" ? undefined : teamId } } }, { isSystemWide: true }] },
include: { teams: true, githubApp: true, gitlabApp: true }
});
return {
@@ -22,11 +22,11 @@ export async function listSources(request: FastifyRequest) {
export async function saveSource(request, reply) {
try {
const { id } = request.params
let { name, htmlUrl, apiUrl, customPort } = request.body
let { name, htmlUrl, apiUrl, customPort, isSystemWide } = request.body
if (customPort) customPort = Number(customPort)
await prisma.gitSource.update({
where: { id },
data: { name, htmlUrl, apiUrl, customPort }
data: { name, htmlUrl, apiUrl, customPort, isSystemWide }
});
return reply.code(201).send()
} catch ({ status, message }) {
@@ -56,7 +56,7 @@ export async function getSource(request: FastifyRequest<OnlyId>) {
}
const source = await prisma.gitSource.findFirst({
where: { id, teams: { some: { id: teamId === '0' ? undefined : teamId } } },
where: { id, OR: [{ teams: { some: { id: teamId === "0" ? undefined : teamId } } }, { isSystemWide: true }] },
include: { githubApp: true, gitlabApp: true }
});
if (!source) {
@@ -104,7 +104,7 @@ export async function saveGitHubSource(request: FastifyRequest<SaveGitHubSource>
const { teamId } = request.user
const { id } = request.params
let { name, htmlUrl, apiUrl, organization, customPort } = request.body
let { name, htmlUrl, apiUrl, organization, customPort, isSystemWide } = request.body
if (customPort) customPort = Number(customPort)
if (id === 'new') {
@@ -117,6 +117,7 @@ export async function saveGitHubSource(request: FastifyRequest<SaveGitHubSource>
apiUrl,
organization,
customPort,
isSystemWide,
type: 'github',
teams: { connect: { id: teamId } }
}

View File

@@ -7,6 +7,7 @@ export interface SaveGitHubSource extends OnlyId {
apiUrl: string,
organization: string,
customPort: number,
isSystemWide: boolean
}
}
export interface SaveGitLabSource extends OnlyId {

View File

@@ -1,7 +1,6 @@
import axios from "axios";
import cuid from "cuid";
import crypto from "crypto";
import { encrypt, errorHandler, getUIUrl, isDev, prisma } from "../../../lib/common";
import { encrypt, errorHandler, getDomain, getUIUrl, isDev, prisma } from "../../../lib/common";
import { checkContainer, removeContainer } from "../../../lib/docker";
import { createdBranchDatabase, getApplicationFromDBWebhook, removeBranchDatabase } from "../../api/v1/applications/handlers";
@@ -32,13 +31,14 @@ export async function installGithub(request: FastifyRequest<InstallGithub>, repl
}
export async function configureGitHubApp(request, reply) {
try {
const { default: got } = await import('got')
const { code, state } = request.query;
const { apiUrl } = await prisma.gitSource.findFirst({
where: { id: state },
include: { githubApp: true, gitlabApp: true }
});
const { data }: any = await axios.post(`${apiUrl}/app-manifests/${code}/conversions`);
const data: any = await got.post(`${apiUrl}/app-manifests/${code}/conversions`).json()
const { id, client_id, slug, client_secret, pem, webhook_secret } = data
const encryptedClientSecret = encrypt(client_secret);
@@ -66,13 +66,19 @@ export async function configureGitHubApp(request, reply) {
}
export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promise<any> {
try {
const allowedGithubEvents = ['push', 'pull_request'];
const allowedGithubEvents = ['push', 'pull_request', 'ping', 'installation'];
const allowedActions = ['opened', 'reopened', 'synchronize', 'closed'];
const githubEvent = request.headers['x-github-event']?.toString().toLowerCase();
const githubSignature = request.headers['x-hub-signature-256']?.toString().toLowerCase();
if (!allowedGithubEvents.includes(githubEvent)) {
throw { status: 500, message: 'Event not allowed.' }
}
if (githubEvent === 'ping') {
return { pong: 'cool' }
}
if (githubEvent === 'installation') {
return { status: 'cool' }
}
let projectId, branch;
const body = request.body
if (githubEvent === 'push') {
@@ -80,7 +86,7 @@ export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promi
branch = body.ref.includes('/') ? body.ref.split('/')[2] : body.ref;
} else if (githubEvent === 'pull_request') {
projectId = body.pull_request.base.repo.id;
branch = body.pull_request.base.ref.includes('/') ? body.pull_request.base.ref.split('/')[2] : body.pull_request.base.ref;
branch = body.pull_request.base.ref
}
if (!projectId || !branch) {
throw { status: 500, message: 'Cannot parse projectId or branch from the webhook?!' }
@@ -147,14 +153,15 @@ export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promi
} else if (githubEvent === 'pull_request') {
const pullmergeRequestId = body.number.toString();
const pullmergeRequestAction = body.action;
const sourceBranch = body.pull_request.head.ref.includes('/') ? body.pull_request.head.ref.split('/')[2] : body.pull_request.head.ref;
const sourceBranch = body.pull_request.head.ref
const sourceRepository = body.pull_request.head.repo.full_name
if (!allowedActions.includes(pullmergeRequestAction)) {
throw { status: 500, message: 'Action not allowed.' }
}
if (application.settings.previews) {
if (application.destinationDockerId) {
const isRunning = await checkContainer(
const { found: isRunning } = await checkContainer(
{
dockerId: application.destinationDocker.id,
container: application.id
@@ -169,10 +176,29 @@ export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promi
pullmergeRequestAction === 'reopened' ||
pullmergeRequestAction === 'synchronize'
) {
await prisma.application.update({
where: { id: application.id },
data: { updatedAt: new Date() }
});
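// Reuse the existing previewApplication for this PR, or create one with a '<PR number>.<application domain>' custom domain.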
let previewApplicationId = undefined
if (pullmergeRequestId) {
const foundPreviewApplications = await prisma.previewApplication.findMany({ where: { applicationId: application.id, pullmergeRequestId } })
if (foundPreviewApplications.length > 0) {
previewApplicationId = foundPreviewApplications[0].id
} else {
const protocol = application.fqdn.includes('https://') ? 'https://' : 'http://'
const previewApplication = await prisma.previewApplication.create({
data: {
pullmergeRequestId,
sourceBranch,
customDomain: `${protocol}${pullmergeRequestId}.${getDomain(application.fqdn)}`,
application: { connect: { id: application.id } }
}
})
previewApplicationId = previewApplication.id
}
}
// if (application.connectedDatabase && pullmergeRequestAction === 'opened' || pullmergeRequestAction === 'reopened') {
// // Coolify hosted database
// if (application.connectedDatabase.databaseId) {
@@ -186,7 +212,9 @@ export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promi
await prisma.build.create({
data: {
id: buildId,
sourceRepository,
pullmergeRequestId,
previewApplicationId,
sourceBranch,
applicationId: application.id,
destinationDockerId: application.destinationDocker.id,
@@ -198,7 +226,9 @@ export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promi
}
});
return {
message: 'Queued. Thank you!'
};
} else if (pullmergeRequestAction === 'closed') {
if (application.destinationDockerId) {
const id = `${application.id}-${pullmergeRequestId}`;
@@ -206,13 +236,22 @@ export async function gitHubEvents(request: FastifyRequest<GitHubEvents>): Promi
await removeContainer({ id, dockerId: application.destinationDocker.id });
} catch (error) { }
}
if (application.connectedDatabase.databaseId) {
const databaseId = application.connectedDatabase.databaseId;
const database = await prisma.database.findUnique({ where: { id: databaseId } });
if (database) {
await removeBranchDatabase(database, pullmergeRequestId);
const foundPreviewApplications = await prisma.previewApplication.findMany({ where: { applicationId: application.id, pullmergeRequestId } })
if (foundPreviewApplications.length > 0) {
for (const preview of foundPreviewApplications) {
await prisma.previewApplication.delete({ where: { id: preview.id } })
}
}
return {
message: 'PR closed. Thank you!'
};
// if (application?.connectedDatabase?.databaseId) {
// const databaseId = application.connectedDatabase.databaseId;
// const database = await prisma.database.findUnique({ where: { id: databaseId } });
// if (database) {
// await removeBranchDatabase(database, pullmergeRequestId);
// }
// }
}
}
}

View File

@@ -23,6 +23,7 @@ export interface GitHubEvents {
ref: string,
repo: {
id: string,
full_name: string,
}
}
}

View File

@@ -1,8 +1,7 @@
import axios from "axios";
import cuid from "cuid";
import crypto from "crypto";
import type { FastifyReply, FastifyRequest } from "fastify";
import { errorHandler, getAPIUrl, getUIUrl, isDev, listSettings, prisma } from "../../../lib/common";
import { errorHandler, getAPIUrl, getDomain, getUIUrl, isDev, listSettings, prisma } from "../../../lib/common";
import { checkContainer, removeContainer } from "../../../lib/docker";
import { getApplicationFromDB, getApplicationFromDBWebhook } from "../../api/v1/applications/handlers";
@@ -10,6 +9,7 @@ import type { ConfigureGitLabApp, GitLabEvents } from "./types";
export async function configureGitLabApp(request: FastifyRequest<ConfigureGitLabApp>, reply: FastifyReply) {
try {
const { default: got } = await import('got')
const { code, state } = request.query;
const { fqdn } = await listSettings();
const { gitSource: { gitlabApp: { appId, appSecret }, htmlUrl } }: any = await getApplicationFromDB(state, undefined);
@@ -19,19 +19,21 @@ export async function configureGitLabApp(request: FastifyRequest<ConfigureGitLab
if (isDev) {
domain = getAPIUrl();
}
const params = new URLSearchParams({
client_id: appId,
client_secret: appSecret,
code,
state,
grant_type: 'authorization_code',
redirect_uri: `${domain}/webhooks/gitlab`
});
const { data } = await axios.post(`${htmlUrl}/oauth/token`, params)
const { access_token } = await got.post(`${htmlUrl}/oauth/token`, {
searchParams: {
client_id: appId,
client_secret: appSecret,
code,
state,
grant_type: 'authorization_code',
redirect_uri: `${domain}/webhooks/gitlab`
}
}).json()
if (isDev) {
return reply.redirect(`${getUIUrl()}/webhooks/success?token=${data.access_token}`)
return reply.redirect(`${getUIUrl()}/webhooks/success?token=${access_token}`)
}
return reply.redirect(`/webhooks/success?token=${data.access_token}`)
return reply.redirect(`/webhooks/success?token=${access_token}`)
} catch ({ status, message, ...other }) {
return errorHandler({ status, message })
}
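
The configureGitLabApp hunk above swaps axios for got when exchanging the OAuth authorization code, reading the token straight off the JSON response instead of data.access_token. A hedged, standalone version of that exchange (endpoint path, parameter names and the access_token field follow the diff; everything else is a placeholder):

// Exchange a GitLab OAuth code for an access token using got's searchParams + .json(),
// mirroring the shape used in the diff above.
async function exchangeGitLabCode(
  htmlUrl: string,
  appId: string,
  appSecret: string,
  code: string,
  state: string,
  redirectUri: string
): Promise<string> {
  const { default: got } = await import('got');
  const { access_token } = await got
    .post(`${htmlUrl}/oauth/token`, {
      searchParams: {
        client_id: appId,
        client_secret: appSecret,
        code,
        state,
        grant_type: 'authorization_code',
        redirect_uri: redirectUri
      }
    })
    .json<{ access_token: string }>();
  return access_token;
}
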
@@ -39,9 +41,7 @@ export async function configureGitLabApp(request: FastifyRequest<ConfigureGitLab
export async function gitLabEvents(request: FastifyRequest<GitLabEvents>) {
const { object_kind: objectKind, ref, project_id } = request.body
try {
const allowedActions = ['opened', 'reopen', 'close', 'open', 'update'];
const webhookToken = request.headers['x-gitlab-token'];
if (!webhookToken && !isDev) {
throw { status: 500, message: 'Invalid webhookToken.' }
@@ -91,8 +91,8 @@ export async function gitLabEvents(request: FastifyRequest<GitLabEvents>) {
}
}
} else if (objectKind === 'merge_request') {
const { object_attributes: { work_in_progress: isDraft, action, source_branch: sourceBranch, target_branch: targetBranch, iid: pullmergeRequestId }, project: { id } } = request.body
const { object_attributes: { work_in_progress: isDraft, action, source_branch: sourceBranch, target_branch: targetBranch, source: { path_with_namespace: sourceRepository } }, project: { id } } = request.body
const pullmergeRequestId = request.body.object_attributes.iid.toString();
const projectId = Number(id);
if (!allowedActions.includes(action)) {
throw { status: 500, message: 'Action not allowed.' }
@@ -100,14 +100,13 @@ export async function gitLabEvents(request: FastifyRequest<GitLabEvents>) {
if (isDraft) {
throw { status: 500, message: 'Draft MR, do nothing.' }
}
const applicationsFound = await getApplicationFromDBWebhook(projectId, targetBranch);
if (applicationsFound && applicationsFound.length > 0) {
for (const application of applicationsFound) {
const buildId = cuid();
if (application.settings.previews) {
if (application.destinationDockerId) {
const isRunning = await checkContainer(
const { found: isRunning } = await checkContainer(
{
dockerId: application.destinationDocker.id,
container: application.id
@@ -130,10 +129,30 @@ export async function gitLabEvents(request: FastifyRequest<GitLabEvents>) {
where: { id: application.id },
data: { updatedAt: new Date() }
});
let previewApplicationId = undefined
if (pullmergeRequestId) {
const foundPreviewApplications = await prisma.previewApplication.findMany({ where: { applicationId: application.id, pullmergeRequestId } })
if (foundPreviewApplications.length > 0) {
previewApplicationId = foundPreviewApplications[0].id
} else {
const protocol = application.fqdn.includes('https://') ? 'https://' : 'http://'
const previewApplication = await prisma.previewApplication.create({
data: {
pullmergeRequestId,
sourceBranch,
customDomain: `${protocol}${pullmergeRequestId}.${getDomain(application.fqdn)}`,
application: { connect: { id: application.id } }
}
})
previewApplicationId = previewApplication.id
}
}
await prisma.build.create({
data: {
id: buildId,
pullmergeRequestId: pullmergeRequestId.toString(),
pullmergeRequestId,
previewApplicationId,
sourceRepository,
sourceBranch,
applicationId: application.id,
destinationDockerId: application.destinationDocker.id,
@@ -150,8 +169,19 @@ export async function gitLabEvents(request: FastifyRequest<GitLabEvents>) {
} else if (action === 'close') {
if (application.destinationDockerId) {
const id = `${application.id}-${pullmergeRequestId}`;
await removeContainer({ id, dockerId: application.destinationDocker.id });
try {
await removeContainer({ id, dockerId: application.destinationDocker.id });
} catch (error) { }
}
const foundPreviewApplications = await prisma.previewApplication.findMany({ where: { applicationId: application.id, pullmergeRequestId } })
if (foundPreviewApplications.length > 0) {
for (const preview of foundPreviewApplications) {
await prisma.previewApplication.delete({ where: { id: preview.id } })
}
}
return {
message: 'MR closed. Thank you!'
};
}
}
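
The 'close' branch above now wraps container removal in a try/catch and deletes any previewApplication rows for the merge request, mirroring the GitHub handler earlier in this diff. A hedged sketch of that cleanup step (removeContainer's signature and the previewApplication model are taken from the diff, not from documented APIs):

import { PrismaClient } from '@prisma/client';

type RemoveContainer = (args: { id: string; dockerId: string }) => Promise<void>;

// Best-effort container removal, then delete every preview row for this PR/MR.
async function cleanupPreviewOnClose(
  prisma: PrismaClient,
  removeContainer: RemoveContainer,
  application: { id: string; destinationDockerId?: string; destinationDocker?: { id: string } },
  pullmergeRequestId: string
): Promise<void> {
  if (application.destinationDockerId && application.destinationDocker) {
    const id = `${application.id}-${pullmergeRequestId}`;
    try {
      await removeContainer({ id, dockerId: application.destinationDocker.id });
    } catch (error) {
      // the container may already be gone; ignore, as the handlers above do
    }
  }
  const previews = await prisma.previewApplication.findMany({
    where: { applicationId: application.id, pullmergeRequestId }
  });
  for (const preview of previews) {
    await prisma.previewApplication.delete({ where: { id: preview.id } });
  }
}

Note that the GitLab handler normalizes the merge request iid to a string (iid.toString()) before this point, so the container name and the Prisma filters line up with the GitHub path, where the id already arrives as a string.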


@@ -8,6 +8,9 @@ export interface GitLabEvents {
Body: {
object_attributes: {
work_in_progress: string
source: {
path_with_namespace: string
}
isDraft: string
action: string
source_branch: string

File diff suppressed because it is too large


@@ -1,13 +1,12 @@
import { FastifyPluginAsync } from 'fastify';
import { OnlyId } from '../../../types';
import { remoteTraefikConfiguration, traefikConfiguration, traefikOtherConfiguration } from './handlers';
import { TraefikOtherConfiguration } from './types';
import { proxyConfiguration, otherProxyConfiguration } from './handlers';
import { OtherProxyConfiguration } from './types';
const root: FastifyPluginAsync = async (fastify): Promise<void> => {
fastify.get('/main.json', async (request, reply) => traefikConfiguration(request, reply));
fastify.get<TraefikOtherConfiguration>('/other.json', async (request, reply) => traefikOtherConfiguration(request));
fastify.get<OnlyId>('/remote/:id', async (request) => remoteTraefikConfiguration(request));
fastify.get<OnlyId>('/main.json', async (request, reply) => proxyConfiguration(request, false));
fastify.get<OnlyId>('/remote/:id', async (request) => proxyConfiguration(request, true));
fastify.get<OtherProxyConfiguration>('/other.json', async (request, reply) => otherProxyConfiguration(request));
};
export default root;
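
The route file above collapses the three Traefik handlers into two: a single proxyConfiguration that serves both the local main.json and the per-server /remote/:id config behind a boolean flag (which is why OnlyId's id becomes optional later in this diff), plus otherProxyConfiguration for other.json. A hedged Fastify sketch of that route shape, with placeholder handler bodies standing in for handlers.ts:

import Fastify, { FastifyRequest } from 'fastify';

interface OnlyId {
  Params: { id?: string };
}
interface OtherProxyConfiguration {
  Querystring: { id: string; privatePort: number };
}

// Placeholder handlers; the real ones build the proxy configuration.
async function proxyConfiguration(request: FastifyRequest<OnlyId>, remote: boolean) {
  return { remote, id: request.params.id ?? null };
}
async function otherProxyConfiguration(request: FastifyRequest<OtherProxyConfiguration>) {
  const { id, privatePort } = request.query;
  return { id, privatePort };
}

const fastify = Fastify();
fastify.get<OnlyId>('/main.json', async (request) => proxyConfiguration(request, false));
fastify.get<OnlyId>('/remote/:id', async (request) => proxyConfiguration(request, true));
fastify.get<OtherProxyConfiguration>('/other.json', async (request) => otherProxyConfiguration(request));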


@@ -1,4 +1,4 @@
export interface TraefikOtherConfiguration {
export interface OtherProxyConfiguration {
Querystring: {
id: string,
privatePort: number,


@@ -1,4 +1,4 @@
export interface OnlyId {
Params: { id: string },
Params: { id?: string },
}

apps/api/tags.json — file diff suppressed because one or more lines are too long

apps/api/templates.json — file diff suppressed because one or more lines are too long


@@ -14,40 +14,43 @@
"format": "prettier --write --plugin-search-dir=. ."
},
"devDependencies": {
"@floating-ui/dom": "1.0.1",
"@playwright/test": "1.25.1",
"@floating-ui/dom": "1.0.3",
"@playwright/test": "1.27.1",
"@popperjs/core": "2.11.6",
"@sveltejs/kit": "1.0.0-next.405",
"@types/js-cookie": "3.0.2",
"@typescript-eslint/eslint-plugin": "5.36.1",
"@typescript-eslint/parser": "5.36.1",
"autoprefixer": "10.4.8",
"classnames": "2.3.1",
"eslint": "8.23.0",
"@typescript-eslint/eslint-plugin": "5.41.0",
"@typescript-eslint/parser": "5.41.0",
"autoprefixer": "10.4.12",
"classnames": "2.3.2",
"eslint": "8.26.0",
"eslint-config-prettier": "8.5.0",
"eslint-plugin-svelte3": "4.0.0",
"flowbite": "1.5.2",
"flowbite-svelte": "0.26.2",
"postcss": "8.4.16",
"flowbite": "1.5.3",
"flowbite-svelte": "0.27.11",
"postcss": "8.4.18",
"prettier": "2.7.1",
"prettier-plugin-svelte": "2.7.0",
"svelte": "3.50.0",
"svelte-check": "2.9.0",
"prettier-plugin-svelte": "2.8.0",
"svelte": "3.52.0",
"svelte-check": "2.9.2",
"svelte-preprocess": "4.10.7",
"tailwindcss": "3.1.8",
"tailwindcss": "3.2.1",
"tailwindcss-scrollbar": "0.1.0",
"tslib": "2.4.0",
"typescript": "4.8.2",
"vite": "3.1.0"
"typescript": "4.8.4",
"vite": "3.2.0"
},
"type": "module",
"dependencies": {
"@sveltejs/adapter-static": "1.0.0-next.39",
"@tailwindcss/typography": "^0.5.7",
"@sveltejs/adapter-static": "1.0.0-next.46",
"@tailwindcss/typography": "0.5.7",
"cuid": "2.1.8",
"daisyui": "2.24.2",
"daisyui": "2.33.0",
"dayjs": "1.11.6",
"js-cookie": "3.0.1",
"js-yaml": "4.1.0",
"p-limit": "4.0.0",
"socket.io-client": "4.5.3",
"svelte-select": "4.4.7",
"sveltekit-i18n": "2.2.2"
}


@@ -3,33 +3,35 @@ import Cookies from 'js-cookie';
export function getAPIUrl() {
if (GITPOD_WORKSPACE_URL) {
const { href } = new URL(GITPOD_WORKSPACE_URL)
const newURL = href.replace('https://', 'https://3001-').replace(/\/$/, '')
return newURL
const { href } = new URL(GITPOD_WORKSPACE_URL);
const newURL = href.replace('https://', 'https://3001-').replace(/\/$/, '');
return newURL;
}
if (CODESANDBOX_HOST) {
return `https://${CODESANDBOX_HOST.replace(/\$PORT/,'3001')}`
return `https://${CODESANDBOX_HOST.replace(/\$PORT/, '3001')}`;
}
return dev ? 'http://localhost:3001' : 'http://localhost:3000';
return dev
? 'http://localhost:3001'
: 'http://localhost:3000';
}
export function getWebhookUrl(type: string) {
if (GITPOD_WORKSPACE_URL) {
const { href } = new URL(GITPOD_WORKSPACE_URL)
const newURL = href.replace('https://', 'https://3001-').replace(/\/$/, '')
const { href } = new URL(GITPOD_WORKSPACE_URL);
const newURL = href.replace('https://', 'https://3001-').replace(/\/$/, '');
if (type === 'github') {
return `${newURL}/webhooks/github/events`
return `${newURL}/webhooks/github/events`;
}
if (type === 'gitlab') {
return `${newURL}/webhooks/gitlab/events`
return `${newURL}/webhooks/gitlab/events`;
}
}
if (CODESANDBOX_HOST) {
const newURL = `https://${CODESANDBOX_HOST.replace(/\$PORT/,'3001')}`
const newURL = `https://${CODESANDBOX_HOST.replace(/\$PORT/, '3001')}`;
if (type === 'github') {
return `${newURL}/webhooks/github/events`
return `${newURL}/webhooks/github/events`;
}
if (type === 'gitlab') {
return `${newURL}/webhooks/gitlab/events`
return `${newURL}/webhooks/gitlab/events`;
}
}
return `https://webhook.site/0e5beb2c-4e9b-40e2-a89e-32295e570c21/events`;
@@ -37,7 +39,7 @@ export function getWebhookUrl(type: string) {
async function send({
method,
path,
data = {},
data = null,
headers,
timeout = 120000
}: {
@@ -51,7 +53,7 @@ async function send({
const controller = new AbortController();
const id = setTimeout(() => controller.abort(), timeout);
const opts: any = { method, headers: {}, body: null, signal: controller.signal };
if (Object.keys(data).length > 0) {
if (data && Object.keys(data).length > 0) {
const parsedData = data;
for (const [key, value] of Object.entries(data)) {
if (value === '') {
@@ -83,7 +85,9 @@ async function send({
if (dev && !path.startsWith('https://')) {
path = `${getAPIUrl()}${path}`;
}
if (method === 'POST' && data && !opts.body) {
opts.body = data;
}
const response = await fetch(`${path}`, opts);
clearTimeout(id);
@@ -103,7 +107,11 @@ async function send({
return {};
}
if (!response.ok) {
if (response.status === 401 && !path.startsWith('https://api.github') && !path.includes('/v4/user')) {
if (
response.status === 401 &&
!path.startsWith('https://api.github') &&
!path.includes('/v4/')
) {
Cookies.remove('token');
}
@@ -126,7 +134,7 @@ export function del(
export function post(
path: string,
data: Record<string, unknown>,
data: Record<string, unknown> | FormData,
headers?: Record<string, unknown>
): Promise<Record<string, any>> {
return send({ method: 'POST', path, data, headers });
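
To summarize the api helper changes above: data now defaults to null, only non-empty plain objects go through the empty-string-to-null cleanup, a POST whose data never produced a JSON body (in practice a FormData, which is why post()'s signature was widened) is sent as-is, and the 401 exemption is broadened from /v4/user to any GitLab /v4/ API path. A condensed, hedged sketch of just the body-handling decision:

// Decide what body to send, mirroring the logic in send(): plain objects are
// cleaned and JSON-encoded, FormData (or any other non-object body) passes through on POST.
function buildBody(method: string, data: Record<string, unknown> | FormData | null): BodyInit | null {
  let body: BodyInit | null = null;
  if (data && !(data instanceof FormData) && Object.keys(data).length > 0) {
    const parsed: Record<string, unknown> = { ...data };
    for (const [key, value] of Object.entries(parsed)) {
      if (value === '') parsed[key] = null; // empty strings become null, as in the diff
    }
    body = JSON.stringify(parsed);
  }
  if (method === 'POST' && data && !body) {
    body = data as BodyInit; // the FormData case
  }
  return body;
}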


@@ -3,7 +3,7 @@ import { addToast } from '$lib/store';
export const asyncSleep = (delay: number) =>
new Promise((resolve) => setTimeout(resolve, delay));
export function errorNotification(error: any): void {
export function errorNotification(error: any | { message: string }): void {
if (error.message) {
if (error.message === 'Cannot read properties of undefined (reading \'postMessage\')') {
return addToast({
@@ -83,4 +83,8 @@ export function handlerNotFoundLoad(error: any, url: URL) {
status: 500,
error: new Error(`Could not load ${url}`)
};
}
export function getRndInteger(min: number, max: number) {
return Math.floor(Math.random() * (max - min + 1)) + min;
}


@@ -0,0 +1 @@
<span class="badge bg-coollabs-gradient rounded text-white font-normal"> BETA </span>


@@ -13,8 +13,9 @@
export let id: string;
export let name: string;
export let placeholder = '';
export let inputStyle = '';
let disabledClass = 'bg-coolback disabled:bg-coolblack';
let disabledClass = 'bg-coolback disabled:bg-coolblack w-full';
let isHttps = browser && window.location.protocol === 'https:';
function copyToClipboard() {
@@ -32,10 +33,13 @@
{#if !isPasswordField || showPassword}
{#if textarea}
<textarea
style={inputStyle}
rows="5"
class={disabledClass}
class:pr-10={true}
class:pr-20={value && isHttps}
class:border={required && !value}
class:border-red-500={required && !value}
{placeholder}
type="text"
{id}
@@ -47,10 +51,13 @@
>
{:else}
<input
style={inputStyle}
class={disabledClass}
type="text"
class:pr-10={true}
class:pr-20={value && isHttps}
class:border={required && !value}
class:border-red-500={required && !value}
{id}
{name}
{required}
@@ -63,9 +70,12 @@
{/if}
{:else}
<input
style={inputStyle}
class={disabledClass}
class:pr-10={true}
class:pr-20={value && isHttps}
class:border={required && !value}
class:border-red-500={required && !value}
type="password"
{id}
{name}
@@ -78,9 +88,10 @@
/>
{/if}
<div class="absolute top-0 right-0 m-3 cursor-pointer text-stone-600 hover:text-white">
<div class="absolute top-0 right-0 flex justify-center items-center h-full cursor-pointer text-stone-600 hover:text-white mr-3">
<div class="flex space-x-2">
{#if isPasswordField}
<!-- svelte-ignore a11y-click-events-have-key-events -->
<div on:click={() => (showPassword = !showPassword)}>
{#if showPassword}
<svg
@@ -122,6 +133,7 @@
</div>
{/if}
{#if value && isHttps}
<!-- svelte-ignore a11y-click-events-have-key-events -->
<div on:click={copyToClipboard}>
<svg
xmlns="http://www.w3.org/2000/svg"


@@ -1,6 +1,9 @@
<script lang="ts">
import ExternalLink from './ExternalLink.svelte';
import Tooltip from './Tooltip.svelte';
export let url = 'https://docs.coollabs.io';
export let text: any = '';
export let isExternal = false;
let id =
'cool-' +
url
@@ -10,23 +13,32 @@
.slice(-16);
</script>
<a {id} href={url} target="_blank" class="icons inline-block text-pink-500 cursor-pointer text-xs">
<a
{id}
href={url}
target="_blank noreferrer"
class="flex no-underline inline-block cursor-pointer"
class:icons={!text}
>
<svg
xmlns="http://www.w3.org/2000/svg"
class="w-6 h-6"
fill="none"
viewBox="0 0 24 24"
stroke-width="1.5"
stroke="currentColor"
fill="none"
stroke-linecap="round"
stroke-linejoin="round"
class="w-6 h-6"
>
<path stroke="none" d="M0 0h24v24H0z" fill="none" />
<path
d="M6 4h11a2 2 0 0 1 2 2v12a2 2 0 0 1 -2 2h-11a1 1 0 0 1 -1 -1v-14a1 1 0 0 1 1 -1m3 0v18"
stroke-linecap="round"
stroke-linejoin="round"
d="M9.879 7.519c1.171-1.025 3.071-1.025 4.242 0 1.172 1.025 1.172 2.687 0 3.712-.203.179-.43.326-.67.442-.745.361-1.45.999-1.45 1.827v.75M21 12a9 9 0 11-18 0 9 9 0 0118 0zm-9 5.25h.008v.008H12v-.008z"
/>
<line x1="13" y1="8" x2="15" y2="8" />
<line x1="13" y1="12" x2="15" y2="12" />
</svg>
{text}
{#if isExternal}
<ExternalLink />
{/if}
</a>
<Tooltip triggeredBy={`#${id}`}>See details in the documentation</Tooltip>
{#if !text}
<Tooltip triggeredBy={`#${id}`}>See details in the documentation</Tooltip>
{/if}


@@ -1,31 +1,38 @@
<script lang="ts">
import { onMount } from 'svelte';
// import { onMount } from 'svelte';
import Tooltip from './Tooltip.svelte';
// import Tooltip from './Tooltip.svelte';
export let explanation = '';
let id: any;
let self: any;
onMount(() => {
id = `info-${self.offsetLeft}-${self.offsetTop}`;
});
export let position = 'dropdown-right';
// let id: any;
// let self: any;
// onMount(() => {
// id = `info-${self.offsetLeft}-${self.offsetTop}`;
// });
</script>
<div {id} class="inline-block mx-2 text-pink-500 cursor-pointer" bind:this={self}>
<svg
fill="none"
height="18"
shape-rendering="geometricPrecision"
stroke="currentColor"
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="1.5"
viewBox="0 0 24 24"
width="18"
><path d="M12 22c5.523 0 10-4.477 10-10S17.523 2 12 2 2 6.477 2 12s4.477 10 10 10z" /><path
d="M9.09 9a3 3 0 015.83 1c0 2-3 3-3 3"
/><circle cx="12" cy="17" r=".5" />
</svg>
<div class={`dropdown dropdown-end ${position}`}>
<!-- svelte-ignore a11y-label-has-associated-control -->
<!-- svelte-ignore a11y-no-noninteractive-tabindex -->
<label tabindex="0" class="btn btn-circle btn-ghost btn-xs text-sky-500">
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
class="w-4 h-4 stroke-current"
><path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"
/></svg
>
</label>
<!-- svelte-ignore a11y-no-noninteractive-tabindex -->
<div tabindex="0" class="card compact dropdown-content shadow bg-coolgray-400 rounded w-64">
<div class="card-body">
<!-- <h2 class="card-title">You needed more info?</h2> -->
<p class="text-xs font-normal">{@html explanation}</p>
</div>
</div>
</div>
{#if id}
<Tooltip triggeredBy={`#${id}`}>{@html explanation}</Tooltip>
{/if}


@@ -0,0 +1,10 @@
<svg
xmlns="http://www.w3.org/2000/svg"
fill="currentColor"
viewBox="0 0 24 24"
stroke-width="3"
stroke="currentColor"
class="w-3 h-3 text-white"
>
<path stroke-linecap="round" stroke-linejoin="round" d="M4.5 19.5l15-15m0 0H8.25m11.25 0v11.25" />
</svg>



@@ -0,0 +1,37 @@
<script lang="ts">
export let id: any;
import { status } from '$lib/store';
let serviceStatus = {
isExcluded: false,
isExited: false,
isRunning: false,
isRestarting: false,
isStopped: false
};
$: if (Object.keys($status.service.statuses).length > 0 && $status.service.statuses[id]?.status) {
let { isExited, isRunning, isRestarting, isExcluded } = $status.service.statuses[id].status;
serviceStatus.isExited = isExited;
serviceStatus.isRunning = isRunning;
serviceStatus.isExcluded = isExcluded;
serviceStatus.isRestarting = isRestarting;
serviceStatus.isStopped = !isExited && !isRunning && !isRestarting;
} else {
serviceStatus.isExited = false;
serviceStatus.isRunning = false;
serviceStatus.isExcluded = false;
serviceStatus.isRestarting = false;
serviceStatus.isStopped = true;
}
</script>
{#if serviceStatus.isExcluded}
<span class="badge font-bold uppercase rounded text-orange-500 mt-2">Excluded</span>
{:else if serviceStatus.isRunning}
<span class="badge font-bold uppercase rounded text-green-500 mt-2">Running</span>
{:else if serviceStatus.isStopped || serviceStatus.isExited}
<span class="badge font-bold uppercase rounded text-red-500 mt-2">Stopped</span>
{:else if serviceStatus.isRestarting}
<span class="badge font-bold uppercase rounded text-yellow-500 mt-2">Restarting</span>
{/if}
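
The new ServiceStatus component above maps a per-container status object from the shared store onto one of four badges, treating a missing status as stopped. The same derivation as a plain function (a hedged sketch; the status shape is inferred from the component, not from a documented API):

type ContainerStatus = {
  isExited: boolean;
  isRunning: boolean;
  isRestarting: boolean;
  isExcluded: boolean;
};

type Badge = 'Excluded' | 'Running' | 'Stopped' | 'Restarting';

// Mirrors the component's branch order: excluded wins, then running,
// then stopped/exited, then restarting; no status at all means stopped.
function badgeFor(status?: ContainerStatus): Badge {
  if (!status) return 'Stopped';
  const isStopped = !status.isExited && !status.isRunning && !status.isRestarting;
  if (status.isExcluded) return 'Excluded';
  if (status.isRunning) return 'Running';
  if (isStopped || status.isExited) return 'Stopped';
  return 'Restarting';
}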


@@ -1,11 +1,14 @@
<script lang="ts">
import Beta from './Beta.svelte';
import Explaner from './Explainer.svelte';
import Tooltip from './Tooltip.svelte';
export let id: any;
export let customClass: any = null;
export let setting: any;
export let title: any;
export let description: any;
export let isBeta: any = false;
export let description: any = null;
export let isCenter = true;
export let disabled = false;
export let dataTooltip: any = null;
@@ -15,12 +18,20 @@
<div class="flex items-center py-4 pr-8">
<div class="flex w-96 flex-col">
<div class="text-xs font-bold text-stone-100 md:text-base">
{title}<Explaner explanation={description} />
</div>
<!-- svelte-ignore a11y-label-has-associated-control -->
<label>
{title}
{#if isBeta}
<Beta />
{/if}
{#if description && description !== ''}
<Explaner explanation={description} />
{/if}
</label>
</div>
</div>
<div class:text-center={isCenter} class="flex justify-center">
<div class:text-center={isCenter} class={`flex justify-center ${customClass}`}>
<!-- svelte-ignore a11y-click-events-have-key-events -->
<div
on:click
aria-pressed="false"


@@ -2,16 +2,21 @@
import { createEventDispatcher } from 'svelte';
const dispatch = createEventDispatcher();
export let type = 'info';
function success() {
if (type === 'success') {
return 'bg-dark lg:bg-primary';
}
}
</script>
<!-- svelte-ignore a11y-click-events-have-key-events -->
<div
on:click={() => dispatch('click')}
on:mouseover={() => dispatch('pause')}
on:focus={() => dispatch('pause')}
on:mouseout={() => dispatch('resume')}
on:blur={() => dispatch('resume')}
class="alert shadow-lg text-white rounded hover:scale-105 transition-all duration-100 cursor-pointer"
class:bg-coollabs={type === 'success'}
class={` flex flex-row justify-center alert shadow-lg text-white hover:scale-105 transition-all duration-100 cursor-pointer rounded ${success()}`}
class:alert-error={type === 'error'}
class:alert-info={type === 'info'}
>


@@ -1,13 +1,12 @@
<script lang="ts">
import { fade } from 'svelte/transition';
import Toast from './Toast.svelte';
import { dismissToast, pauseToast, resumeToast, toasts } from '$lib/store';
</script>
{#if $toasts}
{#if $toasts.length > 0}
<section>
<article class="toast toast-top toast-end rounded-none" role="alert" transition:fade>
<article class="toast toast-top toast-center rounded-none w-2/3 lg:w-[20rem]" role="alert">
{#each $toasts as toast (toast.id)}
<Toast
type={toast.type}


@@ -1,8 +1,10 @@
<script lang="ts">
import { Tooltip } from 'flowbite-svelte';
export let placement = 'bottom';
export let color = 'bg-coollabs text-left';
export let color = 'bg-coollabs';
export let triggeredBy = '#tooltip-default';
</script>
<Tooltip {triggeredBy} {placement} arrow={false} {color} style="custom"><slot /></Tooltip>
<Tooltip {triggeredBy} {placement} arrow={false} defaultClass={color + ' font-thin text-xs text-left border-none p-2'} style="custom"
><slot /></Tooltip
>


@@ -1,12 +1,11 @@
<script lang="ts">
import { dev } from '$app/env';
import { get, post } from '$lib/api';
import { addToast, appSession, features } from '$lib/store';
import { addToast, appSession, features, updateLoading, isUpdateAvailable } from '$lib/store';
import { asyncSleep, errorNotification } from '$lib/common';
import { onMount } from 'svelte';
import Tooltip from './Tooltip.svelte';
let isUpdateAvailable = false;
let updateStatus: any = {
found: false,
loading: false,
@@ -58,37 +57,41 @@
if ($appSession.userId) {
const overrideVersion = $features.latestVersion;
if ($appSession.teamId === '0') {
if ($updateLoading === true) return;
try {
$updateLoading = true;
const data = await get(`/update`);
if (overrideVersion || data?.isUpdateAvailable) {
latestVersion = overrideVersion || data.latestVersion;
if (overrideVersion) {
isUpdateAvailable = true;
$isUpdateAvailable = true;
} else {
isUpdateAvailable = data.isUpdateAvailable;
$isUpdateAvailable = data.isUpdateAvailable;
}
}
} catch (error) {
return errorNotification(error);
} finally {
$updateLoading = false;
}
}
}
});
</script>
<div class="py-2">
<div class="py-0 lg:py-2">
{#if $appSession.teamId === '0'}
{#if isUpdateAvailable}
{#if $isUpdateAvailable}
<button
id="update"
disabled={updateStatus.success === false}
on:click={update}
class="icons bg-gradient-to-r from-purple-500 via-pink-500 to-red-500 text-white duration-75 hover:scale-105"
class="icons bg-coollabs-gradient text-white duration-75 hover:scale-105 w-full"
>
{#if updateStatus.loading}
<svg
xmlns="http://www.w3.org/2000/svg"
class="lds-heart h-9 w-8"
class="lds-heart h-8 w-8"
viewBox="0 0 24 24"
stroke-width="1.5"
stroke="currentColor"
@@ -102,24 +105,27 @@
/>
</svg>
{:else if updateStatus.success === null}
<svg
xmlns="http://www.w3.org/2000/svg"
class="h-9 w-8"
viewBox="0 0 24 24"
stroke-width="1.5"
stroke="currentColor"
fill="none"
stroke-linecap="round"
stroke-linejoin="round"
>
<path stroke="none" d="M0 0h24v24H0z" fill="none" />
<circle cx="12" cy="12" r="9" />
<line x1="12" y1="8" x2="8" y2="12" />
<line x1="12" y1="8" x2="12" y2="16" />
<line x1="16" y1="12" x2="12" y2="8" />
</svg>
<div class="flex items-center justify-center space-x-2">
<svg
xmlns="http://www.w3.org/2000/svg"
class="h-8 w-8"
viewBox="0 0 24 24"
stroke-width="1.5"
stroke="currentColor"
fill="none"
stroke-linecap="round"
stroke-linejoin="round"
>
<path stroke="none" d="M0 0h24v24H0z" fill="none" />
<circle cx="12" cy="12" r="9" />
<line x1="12" y1="8" x2="8" y2="12" />
<line x1="12" y1="8" x2="12" y2="16" />
<line x1="16" y1="12" x2="12" y2="8" />
</svg>
<span class="flex lg:hidden">Update available</span>
</div>
{:else if updateStatus.success}
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 36 36" class="h-9 w-8"
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 36 36" class="h-8 w-8"
><path
fill="#DD2E44"
d="M11.626 7.488c-.112.112-.197.247-.268.395l-.008-.008L.134 33.141l.011.011c-.208.403.14 1.223.853 1.937.713.713 1.533 1.061 1.936.853l.01.01L28.21 24.735l-.008-.009c.147-.07.282-.155.395-.269 1.562-1.562-.971-6.627-5.656-11.313-4.687-4.686-9.752-7.218-11.315-5.656z"
@@ -184,7 +190,9 @@
>
{/if}
</button>
<Tooltip triggeredBy="#update" placement="right" color="bg-gradient-to-r from-purple-500 via-pink-500 to-red-500">New Version Available!</Tooltip>
<Tooltip triggeredBy="#update" placement="right" color="bg-coolgray-200 text-white"
>New Version Available!</Tooltip
>
{/if}
{/if}
</div>
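
The Update component above moves isUpdateAvailable and updateLoading into the shared store and skips the version check when one is already running. A hedged sketch of that guard using plain svelte/store writables (store names mirror the diff; the /update response shape is assumed from the call site):

import { writable, get } from 'svelte/store';

// Shared stores, standing in for the ones exported from $lib/store in the diff.
export const updateLoading = writable(false);
export const isUpdateAvailable = writable(false);

// Run at most one update check at a time and flip isUpdateAvailable from the response.
export async function checkForUpdate(
  fetchUpdate: () => Promise<{ isUpdateAvailable: boolean; latestVersion: string }>
): Promise<string | undefined> {
  if (get(updateLoading)) return;
  updateLoading.set(true);
  try {
    const data = await fetchUpdate();
    isUpdateAvailable.set(Boolean(data?.isUpdateAvailable));
    return data?.latestVersion;
  } finally {
    updateLoading.set(false);
  }
}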

Some files were not shown because too many files have changed in this diff