CI and local dev environment improvements

The need for this became very noticeable due to the slowness of filesystem access in Docker for Mac, with a full compile taking over a minute for me in docker. Using make to introduce incremental compilation makes this near-instantaneous outside of docker (if only a few files have changed), and quick enough inside docker.
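
The incremental behaviour comes from ordinary make pattern rules that map each `.coffee` file to a compiled `.js` target, so only files whose source is newer than the output get recompiled. A minimal sketch of the idea (condensed from the Makefile in this PR; recipe lines must be indented with tabs):

```make
# Condensed sketch of the incremental CoffeeScript build.
COFFEE := node_modules/.bin/coffee
APP_COFFEE_FILES := $(shell find app/coffee -name '*.coffee')
APP_JS_FILES := $(subst coffee,js,$(APP_COFFEE_FILES))

# Rebuild app/js/foo.js only when app/coffee/foo.coffee is newer.
app/js/%.js: app/coffee/%.coffee
	@mkdir -p $(dir $@)
	$(COFFEE) --compile --print $< > $@

compile: $(APP_JS_FILES)
```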

With incremental compilation via make, re-compiling and restarting the web service automatically whenever backend files change is now fast enough to be practical. This is how the service is run via docker-compose in https://github.com/sharelatex/sharelatex-dev-environment, so it shouldn't be necessary to manually restart the container each time a coffee file changes.
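
The watch/restart side is just nodemon driving the same make-based compile; a rough sketch of how this can be run locally (the wrapper target names are illustrative, only the `npm run nodemon`/`nodemon:frontend` scripts are part of this PR):

```make
# Illustrative wrappers around the nodemon scripts added to package.json.
run_dev:
	npm run nodemon              # watches app.coffee, app/coffee/, modules/*/app/coffee/

run_dev_frontend:
	npm run nodemon:frontend     # watches public/coffee/ and public/stylesheets/, runs "make compile"
```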

At the moment Jenkins pulls web modules in via the GitSCM plugin, but I believe this creates a dependency in Jenkins where a commit to any of the modules causes all of the web branches to rebuild. By cloning them via our own scripts we can hopefully avoid this. It also makes the build process reproducible locally.

**Note that at the moment in this PR all modules pull from `ja-dockerize-dev` branches. These should be merged first, and this PR updated to point to the master branches before merging.** Those branches contain changes needed by the new build process/docker-compose workflow.

As well as a Makefile for web, there is now a `Makefile.module`. This is copied into each module directory by the top-level Makefile, and is written to be flexible: it supports unit tests, acceptance tests, front-end JS for the ide and main entry points, and the module's `app/coffee` directory, while allowing modules to have some of these missing (not all modules have e.g. acceptance tests or front-end JS). This will allow us to refine the build process in the future without needing to update the Makefile in each module repo separately (I found this to be a painful part of this development).
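
Roughly, the mechanism looks like this (a simplified sketch of the rules in this PR): the top-level Makefile stamps `Makefile.module` into every module directory and then recurses into each one, while `Makefile.module` itself guards each file list with `[ -e dir ] && find dir`, so a module without acceptance tests or front-end JS just ends up with an empty target list.

```make
# Simplified sketch of the top-level wiring for per-module builds.
MODULE_DIRS := $(shell find modules -mindepth 1 -maxdepth 1 -type d -not -name '.git')
MODULE_MAKEFILES := $(MODULE_DIRS:=/Makefile)

# Copy the shared Makefile.module into each module, so build changes are made
# once here rather than in every module repo.
$(MODULE_MAKEFILES): Makefile.module
	cp Makefile.module $@

compile_modules: $(MODULE_MAKEFILES)
	@set -e; \
	for dir in $(MODULE_DIRS); do \
		(cd $$dir && $(MAKE) compile); \
	done
```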

This makes web compatible with the docker-compose workflow at https://github.com/sharelatex/sharelatex-dev-environment, where each service is running in its own docker container, with networking managed by docker.

Previously the Makefile was set up to run unit tests in docker with `make test_unit`. This now just runs them natively. In CI they are run in docker anyway (all steps in Jenkins are), and locally they run fine natively with `npm run test:unit`, or can be run in docker via https://github.com/sharelatex/sharelatex-dev-environment with `bin/run web_sl npm run test:unit`.

Previously we did a lot of juggling to mount only source files (coffee, less, etc.) into the docker container for acceptance tests. This was to avoid creating root-owned files if the whole directory was mounted. Now the whole web directory is mounted read-only instead, with the compilation step done outside the container before running the tests.
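
So the acceptance-test flow is now roughly: compile on the host, bring the service up via docker-compose with the whole tree mounted read-only (plus a writable data volume), then run the tests inside that container. A condensed sketch of the sequence, based on the Makefile targets in this PR:

```make
# Condensed sketch of the acceptance-test flow (the real Makefile splits this
# across start/run/stop targets).
test_acceptance_app:
	$(MAKE) compile                          # compile outside the container
	docker-compose up -d test_acceptance     # web dir is mounted read-only at /app
	docker-compose exec -T test_acceptance npm -q run test:acceptance -- ${MOCHA_ARGS}
	docker-compose stop test_acceptance redis mongo
```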

This allows the host and container to share the `node_modules` folder as well, which avoids needing to `npm install` twice on the CI box, and should speed up the build by a few minutes.

On macOS, this would cause a problem with compiled native modules if you tried to use the same `node_modules` to run the app natively. However, if running via docker-compose in https://github.com/sharelatex/sharelatex-dev-environment, this is no longer a problem.
James Allen 2017-12-28 20:11:27 +00:00
parent 62bc3f947f
commit 662122bb1c
23 changed files with 365 additions and 979 deletions


@ -80,3 +80,5 @@ app/views/external
/modules/
docker-shared.yml
config/*.coffee


@ -14,38 +14,12 @@ pipeline {
}
stages {
stage('Set up') {
agent {
docker {
image 'node:6.9.5'
reuseNode true
}
}
stage('Install modules') {
steps {
// we need to disable logallrefupdates, else git clones during the npm install will require git to lookup the user id
// which does not exist in the container's /etc/passwd file, causing the clone to fail.
sh 'git config --global core.logallrefupdates false'
sh 'rm -rf node_modules/*'
sshagent (credentials: ['GIT_DEPLOY_KEY']) {
sh 'bin/install_modules'
}
}
stage('Clone Dependencies') {
steps {
sh 'rm -rf public/brand modules'
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'public/brand'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/brand-sharelatex']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'app/views/external'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/external-pages-sharelatex']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/web-sharelatex-modules']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/admin-panel'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/admin-panel']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/groovehq'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@bitbucket.org:sharelatex/groovehq']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/references-search'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@bitbucket.org:sharelatex/references-search.git']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/tpr-webmodule'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/tpr-webmodule.git ']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/learn-wiki'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@bitbucket.org:sharelatex/learn-wiki-web-module.git']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/templates'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/templates-webmodule.git']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/track-changes'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/track-changes-web-module.git']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/overleaf-integration'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/overleaf-integration-web-module.git']]])
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'modules/overleaf-account-merge'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/overleaf-account-merge.git']]])
}
}
stage('Install') {
@ -58,22 +32,12 @@ pipeline {
}
steps {
sh 'git config --global core.logallrefupdates false'
sh 'mv app/views/external/robots.txt public/robots.txt'
sh 'mv app/views/external/googlebdb0f8f7f4a17241.html public/googlebdb0f8f7f4a17241.html'
sh 'npm --quiet install'
sh 'rm -rf node_modules/'
sh 'npm install --quiet'
sh 'npm rebuild'
// It's too easy to end up shrinkwrapping to an outdated version of translations.
// Ensure translations are always latest, regardless of shrinkwrap
sh 'npm install git+https://github.com/sharelatex/translations-sharelatex.git#master'
sh 'npm install --quiet grunt'
sh 'npm install --quiet grunt-cli'
sh 'ls -l node_modules/.bin'
}
}
stage('Test') {
steps {
sh 'make ci'
}
}
@ -85,13 +49,13 @@ pipeline {
}
}
steps {
sh 'node_modules/.bin/grunt compile compile:tests --verbose'
sh 'make compile_full'
// replace the build number placeholder for sentry
sh 'node_modules/.bin/grunt version'
}
}
stage('Smoke Test') {
stage('Unit Test') {
agent {
docker {
image 'node:6.9.5'
@ -99,7 +63,14 @@ pipeline {
}
}
steps {
sh 'node_modules/.bin/grunt compile:smoke_tests'
sh 'make --no-print-directory test_unit test_frontend MOCHA_ARGS="--reporter tap"'
}
}
stage('Acceptance Test') {
steps {
// Spawns its own docker containers
sh 'make --no-print-directory test_acceptance MOCHA_ARGS="--reporter tap"'
}
}
@ -111,7 +82,7 @@ pipeline {
}
}
steps {
sh 'node_modules/.bin/grunt compile:minify'
sh 'make minify'
}
}
@ -149,7 +120,7 @@ pipeline {
post {
always {
sh 'make ci_clean'
sh 'make clean_ci'
}
failure {


@ -1,29 +1,166 @@
DOCKER_COMPOSE_FLAGS ?= -f docker-compose.yml
NPM := docker-compose ${DOCKER_COMPOSE_FLAGS} run --rm npm npm -q
BUILD_NUMBER ?= local
BRANCH_NAME ?= $(shell git rev-parse --abbrev-ref HEAD)
PROJECT_NAME = web
all: install test
@echo "Run:"
@echo " make install to set up the project dependencies (in docker)"
@echo " make test to run all the tests for the project (in docker)"
MODULE_DIRS := $(shell find modules -mindepth 1 -maxdepth 1 -type d -not -name '.git' )
MODULE_MAKEFILES := $(MODULE_DIRS:=/Makefile)
COFFEE := node_modules/.bin/coffee
GRUNT := node_modules/.bin/grunt
APP_COFFEE_FILES := $(shell find app/coffee -name '*.coffee')
FRONT_END_COFFEE_FILES := $(shell find public/coffee -name '*.coffee')
TEST_COFFEE_FILES := $(shell find test -name '*.coffee')
MODULE_MAIN_COFFEE_FILES := $(shell find modules -type f -wholename '*main/index.coffee')
MODULE_IDE_COFFEE_FILES := $(shell find modules -type f -wholename '*ide/index.coffee')
COFFEE_FILES := app.coffee $(APP_COFFEE_FILES) $(FRONT_END_COFFEE_FILES) $(TEST_COFFEE_FILES)
JS_FILES := $(subst coffee,js,$(COFFEE_FILES))
SHAREJS_COFFEE_FILES := \
public/coffee/ide/editor/sharejs/header.coffee \
public/coffee/ide/editor/sharejs/vendor/types/helpers.coffee \
public/coffee/ide/editor/sharejs/vendor/types/text.coffee \
public/coffee/ide/editor/sharejs/vendor/types/text-api.coffee \
public/coffee/ide/editor/sharejs/vendor/client/microevent.coffee \
public/coffee/ide/editor/sharejs/vendor/client/doc.coffee \
public/coffee/ide/editor/sharejs/vendor/client/ace.coffee
LESS_FILES := $(shell find public/stylesheets -name '*.less')
CSS_FILES := public/stylesheets/style.css public/stylesheets/ol-style.css
add: docker-shared.yml
$(NPM) install --save ${P}
app.js: app.coffee
$(COFFEE) --compile --print $< > $@
add_dev: docker-shared.yml
$(NPM) install --save-dev ${P}
app/js/%.js: app/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
install: docker-shared.yml
bin/generate_volumes_file
$(NPM) install
public/js/%.js: public/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --output $(dir $@) --map --compile $<
clean: ci_clean
test/unit/js/%.js: test/unit/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
test/acceptance/js/%.js: test/acceptance/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
test/unit_frontend/js/%.js: test/unit_frontend/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
test/smoke/js/%.js: test/smoke/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
public/js/libs/sharejs.js: $(SHAREJS_COFFEE_FILES)
@echo "Compiling public/js/libs/sharejs.js"
@echo 'define(["ace/ace"], function() {' > public/js/libs/sharejs.js
@cat $(SHAREJS_COFFEE_FILES) | $(COFFEE) --stdio --print >> public/js/libs/sharejs.js
@echo "" >> public/js/libs/sharejs.js
@echo "return window.sharejs; });" >> public/js/libs/sharejs.js
public/js/ide.js: public/coffee/ide.coffee $(MODULE_IDE_COFFEE_FILES)
@echo Compiling and injecting module includes into public/js/ide.js
@INCLUDES=""; \
for dir in modules/*; \
do \
MODULE=`echo $$dir | cut -d/ -f2`; \
if [ -e $$dir/public/coffee/ide/index.coffee ]; then \
INCLUDES="\"ide/$$MODULE/index\",$$INCLUDES"; \
fi \
done; \
INCLUDES=$${INCLUDES%?}; \
$(COFFEE) --compile --print $< | \
sed -e s=\"__IDE_CLIENTSIDE_INCLUDES__\"=$$INCLUDES= \
> $@
public/js/main.js: public/coffee/main.coffee $(MODULE_MAIN_COFFEE_FILES)
@echo Compiling and injecting module includes into public/js/main.js
@INCLUDES=""; \
for dir in modules/*; \
do \
MODULE=`echo $$dir | cut -d/ -f2`; \
if [ -e $$dir/public/coffee/main/index.coffee ]; then \
INCLUDES="\"main/$$MODULE/index\",$$INCLUDES"; \
fi \
done; \
INCLUDES=$${INCLUDES%?}; \
$(COFFEE) --compile --print $< | \
sed -e s=\"__MAIN_CLIENTSIDE_INCLUDES__\"=$$INCLUDES= \
> $@
$(CSS_FILES): $(LESS_FILES)
$(GRUNT) compile:css
minify: $(CSS_FILES) $(JS_FILES)
$(GRUNT) compile:minify
css: $(CSS_FILES)
compile: $(JS_FILES) css public/js/libs/sharejs.js public/js/main.js public/js/ide.js
@$(MAKE) compile_modules
compile_full:
$(COFFEE) -c -p app.coffee > app.js
$(COFFEE) -o app/js -c app/coffee
$(COFFEE) -o public/js -c public/coffee
$(COFFEE) -o test/acceptance/js -c test/acceptance/coffee
$(COFFEE) -o test/smoke/js -c test/smoke/coffee
$(COFFEE) -o test/unit/js -c test/unit/coffee
$(COFFEE) -o test/unit_frontend/js -c test/unit_frontend/coffee
rm -f public/js/ide.js public/js/main.js # We need to generate ide.js, main.js manually later
$(MAKE) $(CSS_FILES)
$(MAKE) compile_modules_full
$(MAKE) compile # ide.js, main.js, share.js, and anything missed
compile_modules: $(MODULE_MAKEFILES)
@set -e; \
for dir in $(MODULE_DIRS); \
do \
if [ -e $$dir/Makefile ]; then \
(cd $$dir && $(MAKE) compile); \
fi; \
if [ ! -e $$dir/Makefile ]; then \
echo "No makefile found in $$dir"; \
fi; \
done
compile_modules_full: $(MODULE_MAKEFILES)
@set -e; \
for dir in $(MODULE_DIRS); \
do \
if [ -e $$dir/Makefile ]; then \
echo "Compiling $$dir in full"; \
(cd $$dir && $(MAKE) compile_full); \
fi; \
if [ ! -e $$dir/Makefile ]; then \
echo "No makefile found in $$dir"; \
fi; \
done
$(MODULE_MAKEFILES): Makefile.module
@set -e; \
for makefile in $(MODULE_MAKEFILES); \
do \
cp Makefile.module $$makefile; \
done
clean: clean_app clean_frontend clean_css clean_tests clean_modules
clean_app:
rm -f app.js
rm -rf app/js
clean_frontend:
rm -rf public/js/{analytics,directives,filters,ide,main,modules,services,utils}
rm -f public/js/*.js
rm -f public/js/libs/sharejs.js
clean_tests:
rm -rf test/unit/js
rm -rf test/acceptance/js
clean_modules:
for dir in modules/*; \
do \
rm -f $$dir/index.js; \
@ -32,53 +169,54 @@ clean: ci_clean
rm -rf $$dir/test/acceptance/js; \
done
ci_clean:
# Deletes node_modules volume
docker-compose down --volumes
clean_css:
rm -f public/stylesheets/*.css*
# Need regenerating if you change the web modules you have installed
docker-shared.yml:
bin/generate_volumes_file
clean_ci:
docker-compose down
test: test_unit test_frontend test_acceptance
test_unit: docker-shared.yml
docker-compose ${DOCKER_COMPOSE_FLAGS} run --rm test_unit npm -q run test:unit -- ${MOCHA_ARGS}
test_unit:
npm -q run test:unit -- ${MOCHA_ARGS}
test_frontend: docker-shared.yml
docker-compose ${DOCKER_COMPOSE_FLAGS} run --rm test_unit npm -q run test:frontend -- ${MOCHA_ARGS}
test_frontend:
npm -q run test:frontend -- ${MOCHA_ARGS}
test_acceptance: test_acceptance_app test_acceptance_modules
test_acceptance_app: test_acceptance_app_start_service test_acceptance_app_run test_acceptance_app_stop_service
test_acceptance_app: test_acceptance_app_start_service test_acceptance_app_run
$(MAKE) test_acceptance_app_stop_service
test_acceptance_app_start_service: test_acceptance_app_stop_service docker-shared.yml
test_acceptance_app_start_service: test_acceptance_app_stop_service
$(MAKE) compile
docker-compose ${DOCKER_COMPOSE_FLAGS} up -d test_acceptance
test_acceptance_app_stop_service: docker-shared.yml
test_acceptance_app_stop_service:
docker-compose ${DOCKER_COMPOSE_FLAGS} stop test_acceptance redis mongo
test_acceptance_app_run: docker-shared.yml
test_acceptance_app_run:
docker-compose ${DOCKER_COMPOSE_FLAGS} exec -T test_acceptance npm -q run test:acceptance -- ${MOCHA_ARGS}
test_acceptance_modules: docker-shared.yml
# Break and error on any module failure
set -e; \
for dir in modules/*; \
test_acceptance_modules:
@set -e; \
for dir in $(MODULE_DIRS); \
do \
if [ -e $$dir/Makefile ]; then \
(make test_acceptance_module MODULE=$$dir) \
fi \
if [ -e $$dir/test/acceptance ]; then \
$(MAKE) test_acceptance_module MODULE=$$dir; \
fi; \
done
test_acceptance_module: docker-shared.yml
cd $(MODULE) && make test_acceptance
test_acceptance_module: $(MODULE_MAKEFILES)
@if [ -e $(MODULE)/test/acceptance ]; then \
cd $(MODULE) && $(MAKE) test_acceptance; \
fi
ci:
MOCHA_ARGS="--reporter tap" \
$(MAKE) install test
$(MAKE) test
.PHONY:
all add install update test test_unit test_frontend test_acceptance \
test_acceptance_start_service test_acceptance_stop_service \
test_acceptance_run ci ci_clean
test_acceptance_run ci ci_clean compile clean css


@ -0,0 +1,61 @@
MODULE_NAME := $(notdir $(shell pwd))
MODULE_DIR := modules/$(MODULE_NAME)
COFFEE := ../../node_modules/.bin/coffee
APP_COFFEE_FILES := $(shell find app/coffee -name '*.coffee') \
$(shell [ -e test/unit/coffee ] && find test/unit/coffee -name '*.coffee') \
$(shell [ -e test/acceptance/coffee ] && find test/acceptance/coffee -name '*.coffee')
APP_JS_FILES := $(subst coffee,js,$(APP_COFFEE_FILES))
IDE_COFFEE_FILES := $(shell [ -e public/coffee/ide ] && find public/coffee/ide -name '*.coffee')
IDE_JS_FILES := $(subst public/coffee/ide,../../public/js/ide/$(MODULE_NAME),$(IDE_COFFEE_FILES))
IDE_JS_FILES := $(subst coffee,js,$(IDE_JS_FILES))
MAIN_COFFEE_FILES := $(shell [ -e public/coffee/main ] && find public/coffee/main -name '*.coffee')
MAIN_JS_FILES := $(subst public/coffee/main,../../public/js/main/$(MODULE_NAME),$(MAIN_COFFEE_FILES))
MAIN_JS_FILES := $(subst coffee,js,$(MAIN_JS_FILES))
DOCKER_COMPOSE_FLAGS := -f $(MODULE_DIR)/docker-compose.yml
DOCKER_COMPOSE := cd ../../ && MODULE_DIR=$(MODULE_DIR) docker-compose -f docker-compose.yml ${DOCKER_COMPOSE_FLAGS}
app/js/%.js: app/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
test/unit/js/%.js: test/unit/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
test/acceptance/js/%.js: test/acceptance/coffee/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
../../public/js/ide/$(MODULE_NAME)/%.js: public/coffee/ide/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
../../public/js/main/$(MODULE_NAME)/%.js: public/coffee/main/%.coffee
@mkdir -p $(dir $@)
$(COFFEE) --compile --print $< > $@
index.js: index.coffee
$(COFFEE) --compile --print $< > $@
compile: $(APP_JS_FILES) $(IDE_JS_FILES) $(MAIN_JS_FILES) index.js
@echo > /dev/null
compile_full:
$(COFFEE) -o app/js -c app/coffee
if [ -e test/unit/coffee ]; then $(COFFEE) -o test/unit/js -c test/unit/coffee; fi
if [ -e test/acceptance/coffee ]; then $(COFFEE) -o test/acceptance/js -c test/acceptance/coffee; fi
if [ -e public/coffee/ide/ ]; then $(COFFEE) -o ../../public/js/ide/$(MODULE_NAME) -c public/coffee/ide/; fi
if [ -e public/coffee/main/ ]; then $(COFFEE) -o ../../public/js/main/$(MODULE_NAME) -c public/coffee/main/; fi
@$(MAKE) compile
test_acceptance: test_acceptance_start_service test_acceptance_run
$(MAKE) test_acceptance_stop_service
test_acceptance_start_service: test_acceptance_stop_service
$(DOCKER_COMPOSE) up -d test_acceptance
test_acceptance_stop_service:
$(DOCKER_COMPOSE) stop test_acceptance redis mongo
test_acceptance_run:
$(DOCKER_COMPOSE) exec -T test_acceptance npm -q run test:acceptance:dir -- ${MOCHA_ARGS} $(MODULE_DIR)/test/acceptance/js


@ -1,16 +0,0 @@
#!/bin/bash
set -e;
COFFEE=node_modules/.bin/coffee
echo Compiling test/acceptance/coffee;
$COFFEE -o test/acceptance/js -c test/acceptance/coffee;
for dir in modules/*;
do
if [ -d $dir/test/acceptance ]; then
echo Compiling $dir/test/acceptance/coffee;
$COFFEE -o $dir/test/acceptance/js -c $dir/test/acceptance/coffee;
fi
done


@ -1,23 +0,0 @@
#!/bin/bash
set -e;
COFFEE=node_modules/.bin/coffee
echo Compiling app.coffee;
$COFFEE -c app.coffee;
echo Compiling app/coffee;
$COFFEE -o app/js -c app/coffee;
for dir in modules/*;
do
if [ -d $dir/app/coffee ]; then
echo Compiling $dir/app/coffee;
$COFFEE -o $dir/app/js -c $dir/app/coffee;
fi
if [ -e $dir/index.coffee ]; then
echo Compiling $dir/index.coffee;
$COFFEE -c $dir/index.coffee;
fi
done


@ -1,5 +0,0 @@
#!/bin/bash
set -e;
COFFEE=node_modules/.bin/coffee
echo Compiling public/coffee;
$COFFEE -o public/js -c public/coffee;


@ -1,5 +0,0 @@
#!/bin/bash
set -e;
COFFEE=node_modules/.bin/coffee
echo Compiling test/unit_frontend/coffee;
$COFFEE -o test/unit_frontend/js -c test/unit_frontend/coffee;


@ -1,15 +0,0 @@
#!/bin/bash
set -e;
COFFEE=node_modules/.bin/coffee
echo Compiling test/unit/coffee;
$COFFEE -o test/unit/js -c test/unit/coffee;
for dir in modules/*;
do
if [ -d $dir/test/unit ]; then
echo Compiling $dir/test/unit/coffee;
$COFFEE -o $dir/test/unit/js -c $dir/test/unit/coffee;
fi
done


@ -1,26 +0,0 @@
#!/usr/bin/env python2
from os import listdir
from os.path import isfile, isdir, join
volumes = []
for module in listdir("modules/"):
if module[0] != '.':
if isfile(join("modules", module, 'index.coffee')):
volumes.append(join("modules", module, 'index.coffee'))
for directory in ['app/coffee', 'app/views', 'public/coffee', 'test/unit/coffee', 'test/acceptance/coffee', 'test/acceptance/config', 'test/acceptance/files']:
if isdir(join("modules", module, directory)):
volumes.append(join("modules", module, directory))
volumes_string = map(lambda vol: "- ./" + vol + ":/app/" + vol + ":ro", volumes)
volumes_string = "\n ".join(volumes_string)
with open("docker-shared.template.yml", "r") as f:
docker_shared_file = f.read()
docker_shared_file = docker_shared_file.replace("MODULE_VOLUMES", volumes_string)
with open("docker-shared.yml", "w") as f:
f.write(docker_shared_file)


@ -0,0 +1,29 @@
#!/bin/bash
set -e
# Branding
rm -rf public/brand app/views/external
git clone -b master git@github.com:sharelatex/brand-sharelatex public/brand
git clone -b master git@github.com:sharelatex/external-pages-sharelatex app/views/external
mv app/views/external/robots.txt public/robots.txt
mv app/views/external/googlebdb0f8f7f4a17241.html public/googlebdb0f8f7f4a17241.html
# Web Modules
rm -rf modules
BRANCH_NAME=${BRANCH_NAME:="$(git rev-parse --abbrev-ref HEAD)-nope"}
function install_module {
echo "Cloning $1 from $BRANCH_NAME branch"
git clone -b $BRANCH_NAME $1 $2 || {
echo "Cloning from $BRANCH_NAME failed - it likely doesn't exist. Using master";
git clone -b master $1 $2;
}
}
install_module git@github.com:sharelatex/web-sharelatex-modules modules
install_module git@github.com:sharelatex/admin-panel modules/admin-panel
install_module git@github.com:sharelatex/tpr-webmodule.git modules/tpr-webmodule
install_module git@github.com:sharelatex/templates-webmodule.git modules/templates
install_module git@github.com:sharelatex/track-changes-web-module.git modules/track-changes
install_module git@github.com:sharelatex/overleaf-integration-web-module.git modules/overleaf-integration
install_module git@github.com:sharelatex/overleaf-account-merge.git modules/overleaf-account-merge
install_module git@github.com:sharelatex/references-search.git modules/references-search
install_module git@github.com:sharelatex/learn-wiki-web-module.git modules/learn-wiki


@ -35,7 +35,7 @@ module.exports = settings =
# Databases
# ---------
mongo:
url : process.env['MONGO_URL'] || "mongodb://127.0.0.1/sharelatex"
url : process.env['MONGO_URL'] || "mongodb://#{process.env['MONGO_HOST'] or '127.0.0.1'}/sharelatex"
redis:
web:
@ -96,62 +96,61 @@ module.exports = settings =
# options incase you want to run some services on remote hosts.
apis:
web:
url: "http://localhost:#{webPort}"
url: "http://#{process.env['WEB_HOST'] or 'localhost'}:#{webPort}"
user: httpAuthUser
pass: httpAuthPass
documentupdater:
url : "http://#{process.env['DOCUPDATER_HOST'] or 'localhost'}:#{docUpdaterPort}"
thirdPartyDataStore:
url : "http://localhost:3002"
url : "http://#{process.env['TPDS_HOST'] or 'localhost'}:3002"
emptyProjectFlushDelayMiliseconds: 5 * seconds
tags:
url :"http://localhost:3012"
url :"http://#{process.env['TAGS_HOST'] or 'localhost'}:3012"
spelling:
url : "http://localhost:3005"
url : "http://#{process.env['SPELLING_HOST'] or 'localhost'}:3005"
trackchanges:
url : "http://localhost:3015"
url : "http://#{process.env['TRACK_CHANGES_HOST'] or 'localhost'}:3015"
project_history:
sendProjectStructureOps: process.env.PROJECT_HISTORY_ENABLED == 'true' or false
initializeHistoryForNewProjects: process.env.PROJECT_HISTORY_ENABLED == 'true' or false
displayHistoryForNewProjects: process.env.PROJECT_HISTORY_ENABLED == 'true' or false
url : "http://localhost:3054"
url : "http://#{process.env['PROJECT_HISTORY_HOST'] or 'localhost'}:3054"
docstore:
url : "http://#{process.env['DOCSTORE_HOST'] or 'localhost'}:3016"
pubUrl: "http://localhost:3016"
pubUrl: "http://#{process.env['DOCSTORE_HOST'] or 'localhost'}:3016"
chat:
url: "http://localhost:3010"
internal_url: "http://localhost:3010"
url: "http://#{process.env['CHAT_HOST'] or 'localhost'}:3010"
internal_url: "http://#{process.env['CHAT_HOST'] or 'localhost'}:3010"
blog:
port: 3008
university:
url: "http://localhost:3011"
filestore:
url: "http://localhost:3009"
url: "http://#{process.env['FILESTORE_HOST'] or 'localhost'}:3009"
clsi:
url: "http://localhost:3013"
url: "http://#{process.env['CLSI_HOST'] or 'localhost'}:3013"
templates:
url: "http://localhost:3007"
url: "http://#{process.env['TEMPLATES_HOST'] or 'localhost'}:3007"
githubSync:
url: "http://localhost:3022"
url: "http://#{process.env['GITHUB_SYNC_HOST'] or 'localhost'}:3022"
recurly:
privateKey: ""
apiKey: ""
subdomain: ""
geoIpLookup:
url: "http://localhost:8080/json"
url: "http://#{process.env['GEOIP_HOST'] or 'localhost'}:8080/json"
realTime:
url: "http://localhost:3026"
url: "http://#{process.env['REALTIME_HOST'] or 'localhost'}:3026"
contacts:
url: "http://localhost:3036"
url: "http://#{process.env['CONTACTS_HOST'] or 'localhost'}:3036"
sixpack:
url: ""
# references:
# url: "http://localhost:3040"
notifications:
url: "http://localhost:3042"
url: "http://#{process.env['NOTIFICATIONS_HOST'] or 'localhost'}:3042"
analytics:
url: "http://localhost:3050"
url: "http://#{process.env['ANALYTICS_HOST'] or 'localhost'}:3050"
templates:
user_id: process.env.TEMPLATES_USER_ID or "5395eb7aad1f29a88756c7f2"
@ -165,7 +164,7 @@ module.exports = settings =
# Where your instance of ShareLaTeX can be found publically. Used in emails
# that are sent out, generated links, etc.
siteUrl : siteUrl = 'http://localhost:3000'
siteUrl : siteUrl = process.env['PUBLIC_URL'] or 'http://localhost:3000'
# cookie domain
# use full domain for cookies to only be accessible from that domain,
@ -293,7 +292,7 @@ module.exports = settings =
# Should javascript assets be served minified or not. Note that you will
# need to run `grunt compile:minify` within the web-sharelatex directory
# to generate these.
useMinifiedJs: false
useMinifiedJs: process.env['MINIFIED_JS'] == 'true' or false
# Should static assets be sent with a header to tell the browser to cache
# them.


@ -1,25 +1,15 @@
version: "2"
volumes:
node_modules:
data:
services:
npm:
extends:
file: docker-shared.yml
service: app
command: npm install
test_unit:
extends:
file: docker-shared.yml
service: app
command: npm run test:unit
test_acceptance:
extends:
file: docker-shared.yml
service: app
image: node:6.9.5
volumes:
- .:/app:ro
- data:/app/data:rw
working_dir: /app
environment:
REDIS_HOST: redis
MONGO_URL: "mongodb://mongo/sharelatex"
@ -28,7 +18,7 @@ services:
depends_on:
- redis
- mongo
command: npm run start
command: node app.js
redis:
image: redis


@ -1,29 +0,0 @@
version: "2"
# We mount all the directories explicitly so that we are only mounting
# the coffee directories, so that the compiled js is only written inside
# the container, and not back to the local filesystem, where it would be
# root owned, and conflict with working outside of the container.
services:
app:
image: node:6.9.5
volumes:
- ./package.json:/app/package.json
- ./npm-shrinkwrap.json:/app/npm-shrinkwrap.json
- node_modules:/app/node_modules
- ./bin:/app/bin
- ./public/coffee:/app/public/coffee:ro
- ./public/js/ace-1.2.5:/app/public/js/ace-1.2.5
- ./app.coffee:/app/app.coffee:ro
- ./app/coffee:/app/app/coffee:ro
- ./app/templates:/app/app/templates:ro
- ./app/views:/app/app/views:ro
- ./config:/app/config
- ./test/unit/coffee:/app/test/unit/coffee:ro
- ./test/unit_frontend/coffee:/app/test/unit_frontend/coffee:ro
- ./test/acceptance/coffee:/app/test/acceptance/coffee:ro
- ./test/acceptance/files:/app/test/acceptance/files:ro
- ./test/smoke/coffee:/app/test/smoke/coffee:ro
MODULE_VOLUMES
working_dir: /app


@ -0,0 +1,13 @@
{
"ignore": [
".git",
"node_modules/"
],
"verbose": true,
"exec": "make compile",
"watch": [
"public/coffee/",
"public/stylesheets/"
],
"ext": "coffee less"
}

services/web/nodemon.json (new file, 16 lines)

@ -0,0 +1,16 @@
{
"ignore": [
".git",
"node_modules/"
],
"verbose": true,
"execMap": {
"js": "npm run start"
},
"watch": [
"app/coffee/",
"app.coffee",
"modules/*/app/coffee/"
],
"ext": "coffee"
}


@ -1292,6 +1292,15 @@
"resolved": "https://registry.npmjs.org/double-ended-queue/-/double-ended-queue-2.1.0-0.tgz",
"integrity": "sha1-ED01J/0xUo9AGIEwyEHv3XgmTlw="
},
"dtrace-provider": {
"version": "0.6.0",
"resolved": "https://registry.npmjs.org/dtrace-provider/-/dtrace-provider-0.6.0.tgz",
"integrity": "sha1-CweNVReTfYcxAUUtkUZzdVe3XlE=",
"optional": true,
"requires": {
"nan": "2.3.5"
}
},
"each-series": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/each-series/-/each-series-1.0.0.tgz",
@ -3607,15 +3616,6 @@
"type-detect": "4.0.5"
}
},
"dtrace-provider": {
"version": "0.6.0",
"resolved": "https://registry.npmjs.org/dtrace-provider/-/dtrace-provider-0.6.0.tgz",
"integrity": "sha1-CweNVReTfYcxAUUtkUZzdVe3XlE=",
"optional": true,
"requires": {
"nan": "2.3.5"
}
},
"formatio": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/formatio/-/formatio-1.2.0.tgz",


@ -12,17 +12,14 @@
"scripts": {
"test:acceptance:wait_for_app": "echo 'Waiting for app to be accessible' && while (! curl -s -o /dev/null localhost:3000/status) do sleep 1; done",
"test:acceptance:run": "bin/acceptance_test $@",
"test:acceptance:dir": "npm -q run compile:acceptance_tests && npm -q run test:acceptance:wait_for_app && npm -q run test:acceptance:run -- $@",
"test:acceptance:dir": "npm -q run test:acceptance:wait_for_app && npm -q run test:acceptance:run -- $@",
"test:acceptance": "npm -q run test:acceptance:dir -- $@ test/acceptance/js",
"test:unit": "npm -q run compile:backend && npm -q run compile:unit_tests && bin/unit_test $@",
"test:frontend": "npm -q run compile:frontend && npm -q run compile:frontend_tests && bin/frontend_test $@",
"compile:unit_tests": "bin/compile_unit_tests",
"compile:frontend_tests": "bin/compile_frontend_tests",
"compile:acceptance_tests": "bin/compile_acceptance_tests",
"compile:frontend": "bin/compile_frontend",
"compile:backend": "bin/compile_backend",
"compile": "npm -q run compile:backend && npm -q run compile:frontend",
"start": "npm -q run compile && node app.js"
"test:unit": "npm -q run compile && bin/unit_test $@",
"test:frontend": "npm -q run compile && bin/frontend_test $@",
"compile": "make compile",
"start": "npm -q run compile && node app.js",
"nodemon": "nodemon --config nodemon.json",
"nodemon:frontend": "nodemon --config nodemon.frontend.json"
},
"dependencies": {
"archiver": "0.9.0",
@ -94,10 +91,12 @@
"chai": "3.5.0",
"chai-spies": "",
"clean-css": "^3.4.18",
"coffee-script": "^1.7.1",
"es6-promise": "^4.0.5",
"grunt": "0.4.5",
"grunt-available-tasks": "0.4.1",
"grunt-bunyan": "0.5.0",
"grunt-cli": "^1.2.0",
"grunt-contrib-clean": "0.5.0",
"grunt-contrib-coffee": "0.10.0",
"grunt-contrib-less": "0.9.0",
@ -116,6 +115,7 @@
"grunt-sed": "^0.1.1",
"grunt-shell": "^2.1.0",
"mkdirp": "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.1.tgz",
"nodemon": "^1.14.3",
"sandboxed-module": "0.2.0",
"sinon": "^1.17.0",
"timekeeper": "",


@ -1,332 +0,0 @@
// Generated by CoffeeScript 1.4.0
(function() {
var Doc, MicroEvent, types,
__bind = function(fn, me){ return function(){ return fn.apply(me, arguments); }; },
__indexOf = [].indexOf || function(item) { for (var i = 0, l = this.length; i < l; i++) { if (i in this && this[i] === item) return i; } return -1; };
if (typeof WEB === "undefined" || WEB === null) {
types = require('../types');
}
if (typeof WEB !== "undefined" && WEB !== null) {
exports.extendDoc = function(name, fn) {
return Doc.prototype[name] = fn;
};
}
Doc = (function() {
function Doc(connection, name, openData) {
this.connection = connection;
this.name = name;
this.shout = __bind(this.shout, this);
this.flush = __bind(this.flush, this);
openData || (openData = {});
this.version = openData.v;
this.snapshot = openData.snaphot;
if (openData.type) {
this._setType(openData.type);
}
this.state = 'closed';
this.autoOpen = false;
this._create = openData.create;
this.inflightOp = null;
this.inflightCallbacks = [];
this.inflightSubmittedIds = [];
this.pendingOp = null;
this.pendingCallbacks = [];
this.serverOps = {};
}
Doc.prototype._xf = function(client, server) {
var client_, server_;
if (this.type.transformX) {
return this.type.transformX(client, server);
} else {
client_ = this.type.transform(client, server, 'left');
server_ = this.type.transform(server, client, 'right');
return [client_, server_];
}
};
Doc.prototype._otApply = function(docOp, isRemote) {
var oldSnapshot;
oldSnapshot = this.snapshot;
this.snapshot = this.type.apply(this.snapshot, docOp);
this.emit('change', docOp, oldSnapshot);
if (isRemote) {
return this.emit('remoteop', docOp, oldSnapshot);
}
};
Doc.prototype._connectionStateChanged = function(state, data) {
switch (state) {
case 'disconnected':
this.state = 'closed';
if (this.inflightOp) {
this.inflightSubmittedIds.push(this.connection.id);
}
this.emit('closed');
break;
case 'ok':
if (this.autoOpen) {
this.open();
}
break;
case 'stopped':
if (typeof this._openCallback === "function") {
this._openCallback(data);
}
}
return this.emit(state, data);
};
Doc.prototype._setType = function(type) {
var k, v, _ref;
if (typeof type === 'string') {
type = types[type];
}
if (!(type && type.compose)) {
throw new Error('Support for types without compose() is not implemented');
}
this.type = type;
if (type.api) {
_ref = type.api;
for (k in _ref) {
v = _ref[k];
this[k] = v;
}
return typeof this._register === "function" ? this._register() : void 0;
} else {
return this.provides = {};
}
};
Doc.prototype._onMessage = function(msg) {
var callback, docOp, error, oldInflightOp, op, path, response, undo, value, _i, _j, _len, _len1, _ref, _ref1, _ref2, _ref3, _ref4, _ref5, _ref6;
if (msg.open === true) {
this.state = 'open';
this._create = false;
if (this.created == null) {
this.created = !!msg.create;
}
if (msg.type) {
this._setType(msg.type);
}
if (msg.create) {
this.created = true;
this.snapshot = this.type.create();
} else {
if (this.created !== true) {
this.created = false;
}
if (msg.snapshot !== void 0) {
this.snapshot = msg.snapshot;
}
}
if (msg.v != null) {
this.version = msg.v;
}
if (this.inflightOp) {
response = {
doc: this.name,
op: this.inflightOp,
v: this.version
};
if (this.inflightSubmittedIds.length) {
response.dupIfSource = this.inflightSubmittedIds;
}
this.connection.send(response);
} else {
this.flush();
}
this.emit('open');
return typeof this._openCallback === "function" ? this._openCallback(null) : void 0;
} else if (msg.open === false) {
if (msg.error) {
if (typeof console !== "undefined" && console !== null) {
console.error("Could not open document: " + msg.error);
}
this.emit('error', msg.error);
if (typeof this._openCallback === "function") {
this._openCallback(msg.error);
}
}
this.state = 'closed';
this.emit('closed');
if (typeof this._closeCallback === "function") {
this._closeCallback();
}
return this._closeCallback = null;
} else if (msg.op === null && error === 'Op already submitted') {
} else if ((msg.op === void 0 && msg.v !== void 0) || (msg.op && (_ref = msg.meta.source, __indexOf.call(this.inflightSubmittedIds, _ref) >= 0))) {
oldInflightOp = this.inflightOp;
this.inflightOp = null;
this.inflightSubmittedIds.length = 0;
error = msg.error;
if (error) {
if (this.type.invert) {
undo = this.type.invert(oldInflightOp);
if (this.pendingOp) {
_ref1 = this._xf(this.pendingOp, undo), this.pendingOp = _ref1[0], undo = _ref1[1];
}
this._otApply(undo, true);
} else {
this.emit('error', "Op apply failed (" + error + ") and the op could not be reverted");
}
_ref2 = this.inflightCallbacks;
for (_i = 0, _len = _ref2.length; _i < _len; _i++) {
callback = _ref2[_i];
callback(error);
}
} else {
if (msg.v !== this.version) {
throw new Error('Invalid version from server');
}
this.serverOps[this.version] = oldInflightOp;
this.version++;
this.emit('acknowledge', oldInflightOp);
_ref3 = this.inflightCallbacks;
for (_j = 0, _len1 = _ref3.length; _j < _len1; _j++) {
callback = _ref3[_j];
callback(null, oldInflightOp);
}
}
return this.flush();
} else if (msg.op) {
if (msg.v < this.version) {
return;
}
if (msg.doc !== this.name) {
return this.emit('error', "Expected docName '" + this.name + "' but got " + msg.doc);
}
if (msg.v !== this.version) {
return this.emit('error', "Expected version " + this.version + " but got " + msg.v);
}
op = msg.op;
this.serverOps[this.version] = op;
docOp = op;
if (this.inflightOp !== null) {
_ref4 = this._xf(this.inflightOp, docOp), this.inflightOp = _ref4[0], docOp = _ref4[1];
}
if (this.pendingOp !== null) {
_ref5 = this._xf(this.pendingOp, docOp), this.pendingOp = _ref5[0], docOp = _ref5[1];
}
this.version++;
return this._otApply(docOp, true);
} else if (msg.meta) {
_ref6 = msg.meta, path = _ref6.path, value = _ref6.value;
switch (path != null ? path[0] : void 0) {
case 'shout':
return this.emit('shout', value);
default:
return typeof console !== "undefined" && console !== null ? console.warn('Unhandled meta op:', msg) : void 0;
}
} else {
return typeof console !== "undefined" && console !== null ? console.warn('Unhandled document message:', msg) : void 0;
}
};
Doc.prototype.flush = function() {
if (!(this.connection.state === 'ok' && this.inflightOp === null && this.pendingOp !== null)) {
return;
}
this.inflightOp = this.pendingOp;
this.inflightCallbacks = this.pendingCallbacks;
this.pendingOp = null;
this.pendingCallbacks = [];
return this.connection.send({
doc: this.name,
op: this.inflightOp,
v: this.version
});
};
Doc.prototype.submitOp = function(op, callback) {
if (this.type.normalize != null) {
op = this.type.normalize(op);
}
this.snapshot = this.type.apply(this.snapshot, op);
if (this.pendingOp !== null) {
this.pendingOp = this.type.compose(this.pendingOp, op);
} else {
this.pendingOp = op;
}
if (callback) {
this.pendingCallbacks.push(callback);
}
this.emit('change', op);
return setTimeout(this.flush, 0);
};
Doc.prototype.shout = function(msg) {
return this.connection.send({
doc: this.name,
meta: {
path: ['shout'],
value: msg
}
});
};
Doc.prototype.open = function(callback) {
var message,
_this = this;
this.autoOpen = true;
if (this.state !== 'closed') {
return;
}
message = {
doc: this.name,
open: true
};
if (this.snapshot === void 0) {
message.snapshot = null;
}
if (this.type) {
message.type = this.type.name;
}
if (this.version != null) {
message.v = this.version;
}
if (this._create) {
message.create = true;
}
this.connection.send(message);
this.state = 'opening';
return this._openCallback = function(error) {
_this._openCallback = null;
return typeof callback === "function" ? callback(error) : void 0;
};
};
Doc.prototype.close = function(callback) {
this.autoOpen = false;
if (this.state === 'closed') {
return typeof callback === "function" ? callback() : void 0;
}
this.connection.send({
doc: this.name,
open: false
});
this.state = 'closed';
this.emit('closing');
return this._closeCallback = callback;
};
return Doc;
})();
if (typeof WEB === "undefined" || WEB === null) {
MicroEvent = require('./microevent');
}
MicroEvent.mixin(Doc);
exports.Doc = Doc;
}).call(this);


@ -1,85 +0,0 @@
// Generated by CoffeeScript 1.4.0
(function() {
var MicroEvent, nextTick,
__slice = [].slice;
nextTick = typeof WEB !== "undefined" && WEB !== null ? function(fn) {
return setTimeout(fn, 0);
} : process['nextTick'];
MicroEvent = (function() {
function MicroEvent() {}
MicroEvent.prototype.on = function(event, fct) {
var _base;
this._events || (this._events = {});
(_base = this._events)[event] || (_base[event] = []);
this._events[event].push(fct);
return this;
};
MicroEvent.prototype.removeListener = function(event, fct) {
var i, listeners, _base,
_this = this;
this._events || (this._events = {});
listeners = ((_base = this._events)[event] || (_base[event] = []));
i = 0;
while (i < listeners.length) {
if (listeners[i] === fct) {
listeners[i] = void 0;
}
i++;
}
nextTick(function() {
var x;
return _this._events[event] = (function() {
var _i, _len, _ref, _results;
_ref = this._events[event];
_results = [];
for (_i = 0, _len = _ref.length; _i < _len; _i++) {
x = _ref[_i];
if (x) {
_results.push(x);
}
}
return _results;
}).call(_this);
});
return this;
};
MicroEvent.prototype.emit = function() {
var args, event, fn, _i, _len, _ref, _ref1;
event = arguments[0], args = 2 <= arguments.length ? __slice.call(arguments, 1) : [];
if (!((_ref = this._events) != null ? _ref[event] : void 0)) {
return this;
}
_ref1 = this._events[event];
for (_i = 0, _len = _ref1.length; _i < _len; _i++) {
fn = _ref1[_i];
if (fn) {
fn.apply(this, args);
}
}
return this;
};
return MicroEvent;
})();
MicroEvent.mixin = function(obj) {
var proto;
proto = obj.prototype || obj;
proto.on = MicroEvent.prototype.on;
proto.removeListener = MicroEvent.prototype.removeListener;
proto.emit = MicroEvent.prototype.emit;
return obj;
};
if (typeof WEB === "undefined" || WEB === null) {
module.exports = MicroEvent;
}
}).call(this);


@ -1,58 +0,0 @@
// Generated by CoffeeScript 1.4.0
(function() {
var text;
if (typeof WEB === 'undefined') {
text = require('./text');
}
text.api = {
provides: {
text: true
},
getLength: function() {
return this.snapshot.length;
},
getText: function() {
return this.snapshot;
},
insert: function(pos, text, callback) {
var op;
op = [
{
p: pos,
i: text
}
];
this.submitOp(op, callback);
return op;
},
del: function(pos, length, callback) {
var op;
op = [
{
p: pos,
d: this.snapshot.slice(pos, pos + length)
}
];
this.submitOp(op, callback);
return op;
},
_register: function() {
return this.on('remoteop', function(op) {
var component, _i, _len, _results;
_results = [];
for (_i = 0, _len = op.length; _i < _len; _i++) {
component = op[_i];
if (component.i !== void 0) {
_results.push(this.emit('insert', component.p, component.i));
} else {
_results.push(this.emit('delete', component.p, component.d));
}
}
return _results;
});
}
};
}).call(this);


@ -1,239 +0,0 @@
// Generated by CoffeeScript 1.4.0
(function() {
var append, checkValidComponent, checkValidOp, invertComponent, strInject, text, transformComponent, transformPosition;
text = {};
text.name = 'text';
text.create = function() {
return '';
};
strInject = function(s1, pos, s2) {
return s1.slice(0, pos) + s2 + s1.slice(pos);
};
checkValidComponent = function(c) {
var d_type, i_type;
if (typeof c.p !== 'number') {
throw new Error('component missing position field');
}
i_type = typeof c.i;
d_type = typeof c.d;
if (!((i_type === 'string') ^ (d_type === 'string'))) {
throw new Error('component needs an i or d field');
}
if (!(c.p >= 0)) {
throw new Error('position cannot be negative');
}
};
checkValidOp = function(op) {
var c, _i, _len;
for (_i = 0, _len = op.length; _i < _len; _i++) {
c = op[_i];
checkValidComponent(c);
}
return true;
};
text.apply = function(snapshot, op) {
var component, deleted, _i, _len;
checkValidOp(op);
for (_i = 0, _len = op.length; _i < _len; _i++) {
component = op[_i];
if (component.i != null) {
snapshot = strInject(snapshot, component.p, component.i);
} else {
deleted = snapshot.slice(component.p, component.p + component.d.length);
if (component.d !== deleted) {
throw new Error("Delete component '" + component.d + "' does not match deleted text '" + deleted + "'");
}
snapshot = snapshot.slice(0, component.p) + snapshot.slice(component.p + component.d.length);
}
}
return snapshot;
};
text._append = append = function(newOp, c) {
var last, _ref, _ref1;
if (c.i === '' || c.d === '') {
return;
}
if (newOp.length === 0) {
return newOp.push(c);
} else {
last = newOp[newOp.length - 1];
if ((last.i != null) && (c.i != null) && (last.p <= (_ref = c.p) && _ref <= (last.p + last.i.length))) {
return newOp[newOp.length - 1] = {
i: strInject(last.i, c.p - last.p, c.i),
p: last.p
};
} else if ((last.d != null) && (c.d != null) && (c.p <= (_ref1 = last.p) && _ref1 <= (c.p + c.d.length))) {
return newOp[newOp.length - 1] = {
d: strInject(c.d, last.p - c.p, last.d),
p: c.p
};
} else {
return newOp.push(c);
}
}
};
text.compose = function(op1, op2) {
var c, newOp, _i, _len;
checkValidOp(op1);
checkValidOp(op2);
newOp = op1.slice();
for (_i = 0, _len = op2.length; _i < _len; _i++) {
c = op2[_i];
append(newOp, c);
}
return newOp;
};
text.compress = function(op) {
return text.compose([], op);
};
text.normalize = function(op) {
var c, newOp, _i, _len, _ref;
newOp = [];
if ((op.i != null) || (op.p != null)) {
op = [op];
}
for (_i = 0, _len = op.length; _i < _len; _i++) {
c = op[_i];
if ((_ref = c.p) == null) {
c.p = 0;
}
append(newOp, c);
}
return newOp;
};
transformPosition = function(pos, c, insertAfter) {
if (c.i != null) {
if (c.p < pos || (c.p === pos && insertAfter)) {
return pos + c.i.length;
} else {
return pos;
}
} else {
if (pos <= c.p) {
return pos;
} else if (pos <= c.p + c.d.length) {
return c.p;
} else {
return pos - c.d.length;
}
}
};
text.transformCursor = function(position, op, side) {
var c, insertAfter, _i, _len;
insertAfter = side === 'right';
for (_i = 0, _len = op.length; _i < _len; _i++) {
c = op[_i];
position = transformPosition(position, c, insertAfter);
}
return position;
};
text._tc = transformComponent = function(dest, c, otherC, side) {
var cIntersect, intersectEnd, intersectStart, newC, otherIntersect, s;
checkValidOp([c]);
checkValidOp([otherC]);
if (c.i != null) {
append(dest, {
i: c.i,
p: transformPosition(c.p, otherC, side === 'right')
});
} else {
if (otherC.i != null) {
s = c.d;
if (c.p < otherC.p) {
append(dest, {
d: s.slice(0, otherC.p - c.p),
p: c.p
});
s = s.slice(otherC.p - c.p);
}
if (s !== '') {
append(dest, {
d: s,
p: c.p + otherC.i.length
});
}
} else {
if (c.p >= otherC.p + otherC.d.length) {
append(dest, {
d: c.d,
p: c.p - otherC.d.length
});
} else if (c.p + c.d.length <= otherC.p) {
append(dest, c);
} else {
newC = {
d: '',
p: c.p
};
if (c.p < otherC.p) {
newC.d = c.d.slice(0, otherC.p - c.p);
}
if (c.p + c.d.length > otherC.p + otherC.d.length) {
newC.d += c.d.slice(otherC.p + otherC.d.length - c.p);
}
intersectStart = Math.max(c.p, otherC.p);
intersectEnd = Math.min(c.p + c.d.length, otherC.p + otherC.d.length);
cIntersect = c.d.slice(intersectStart - c.p, intersectEnd - c.p);
otherIntersect = otherC.d.slice(intersectStart - otherC.p, intersectEnd - otherC.p);
if (cIntersect !== otherIntersect) {
throw new Error('Delete ops delete different text in the same region of the document');
}
if (newC.d !== '') {
newC.p = transformPosition(newC.p, otherC);
append(dest, newC);
}
}
}
}
return dest;
};
invertComponent = function(c) {
if (c.i != null) {
return {
d: c.i,
p: c.p
};
} else {
return {
i: c.d,
p: c.p
};
}
};
text.invert = function(op) {
var c, _i, _len, _ref, _results;
_ref = op.slice().reverse();
_results = [];
for (_i = 0, _len = _ref.length; _i < _len; _i++) {
c = _ref[_i];
_results.push(invertComponent(c));
}
return _results;
};
if (typeof WEB !== "undefined" && WEB !== null) {
exports.types || (exports.types = {});
bootstrapTransform(text, transformComponent, checkValidOp, append);
exports.types.text = text;
} else {
module.exports = text;
require('./helpers').bootstrapTransform(text, transformComponent, checkValidOp, append);
}
}).call(this);


@ -24,7 +24,7 @@ define [
$(document).on "click", =>
@clearMultiSelectedEntities()
$scope.$digest()
@$scope.$digest()
_bindToSocketEvents: () ->
@ide.socket.on "reciveNewDoc", (parent_folder_id, doc) =>