This repository was archived by the owner on Feb 3, 2021. It is now read-only.

Commit 66037fd

Feature: Add VSTS CI (#561)
* fix job submission bug and integration tests
* merge
* update tests, add vsts-ci.yml
* add python step
* debug statements
* update build
* update build add print
* build update
* fix bug
* debug
* undo
* parallelize build
* add trigger
* typo
* remove env
* whitespace
* whitespace
* whitespace
* remove debug branch
1 parent 1527929 commit 66037fd

File tree: 8 files changed (+453, -355 lines)

.vscode/launch.json
Lines changed: 1 addition & 1 deletion

@@ -35,7 +35,7 @@
                 "spark", "cluster", "create", "--id", "spark-debug"
             ],
             "env": {},
-            "envFile": "${workspaceFolder}/.env",
+            "envFile": "${workspaceFolder}/.venv",
             "debugOptions": [
                 "RedirectOutput"
             ]

.vsts-ci.yml
Lines changed: 20 additions & 0 deletions

@@ -0,0 +1,20 @@
+trigger:
+- master
+
+steps:
+- task: UsePythonVersion@0
+  inputs:
+    versionSpec: '>= 3.5'
+    addToPath: true
+    architecture: 'x64'
+
+- script: |
+    pip install -r requirements.txt
+    pip install -e .
+  condition: and(succeeded(), eq(variables['agent.os'], 'linux'))
+  displayName: install aztk
+
+- script: |
+    pytest -n 50
+  condition: and(succeeded(), in(variables['agent.os'], 'linux'))
+  displayName: pytest
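
The pipeline installs aztk in editable mode and then runs the integration tests in parallel; the -n 50 flag relies on the pytest-xdist plugin. A rough local equivalent of the two script steps, run from the repository root (illustration only, not part of this commit):

import subprocess

# Mirror the CI "install aztk" step, then the "pytest" step.
# `pytest -n 50` assumes pytest-xdist is installed and spreads the tests
# across 50 workers, matching the "parallelize build" commit message entry.
subprocess.run(["pip", "install", "-r", "requirements.txt"], check=True)
subprocess.run(["pip", "install", "-e", "."], check=True)
subprocess.run(["pytest", "-n", "50"], check=True)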

aztk/spark/helpers/job_submission.py
Lines changed: 2 additions & 1 deletion

@@ -22,7 +22,8 @@ def __app_cmd():
     docker_exec.add_argument("spark /bin/bash >> output.log 2>&1 -c \"" \
         "source ~/.bashrc; " \
         "export PYTHONPATH=$PYTHONPATH:\$AZTK_WORKING_DIR; " \
-        "$AZTK_WORKING_DIR/.aztk-env/.venv/bin/python \$AZTK_WORKING_DIR/aztk/node_scripts/job_submission.py\"")
+        "cd \$AZ_BATCH_TASK_WORKING_DIR; " \
+        "\$AZTK_WORKING_DIR/.aztk-env/.venv/bin/python \$AZTK_WORKING_DIR/aztk/node_scripts/job_submission.py\"")
     return docker_exec.to_str()
aztk/spark/helpers/submit.py
Lines changed: 1 addition & 1 deletion

@@ -84,7 +84,7 @@ def generate_task(spark_client, container_id, application):
     task_cmd.add_argument('spark /bin/bash >> output.log 2>&1')
     task_cmd.add_argument('-c "source ~/.bashrc; ' \
         'export PYTHONPATH=$PYTHONPATH:\$AZTK_WORKING_DIR; ' \
-        'cd $AZ_BATCH_TASK_WORKING_DIR; ' \
+        'cd \$AZ_BATCH_TASK_WORKING_DIR; ' \
         '\$AZTK_WORKING_DIR/.aztk-env/.venv/bin/python \$AZTK_WORKING_DIR/aztk/node_scripts/submit.py"')

     # Create task
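
Together with the job_submission.py hunk above, this is the submission fix named in the commit message: the backslash keeps the dollar sign out of the first shell layer that parses the quoted -c argument, so $AZ_BATCH_TASK_WORKING_DIR is expanded by the bash running on the node instead (the same treatment \$AZTK_WORKING_DIR already gets), and the job-submission command now also cd's into that directory first. A minimal sketch of the escaping idea, using plain strings rather than aztk's CommandBuilder, under the assumption that the generated command passes through an outer shell before the inner bash -c runs:

# Illustration only. Python leaves "\$" as a literal backslash-dollar (a raw
# string is used below to avoid the invalid-escape warning newer Pythons
# emit), and inside double quotes the outer shell turns `\$` into a plain
# `$`, deferring expansion to the next shell that evaluates the command.
unescaped = 'cd $AZ_BATCH_TASK_WORKING_DIR; '    # expanded one shell layer too early
escaped = r'cd \$AZ_BATCH_TASK_WORKING_DIR; '    # expansion deferred to the node's bash

print(unescaped)  # cd $AZ_BATCH_TASK_WORKING_DIR;
print(escaped)    # cd \$AZ_BATCH_TASK_WORKING_DIR;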

aztk/spark/models/models.py
Lines changed: 1 addition & 0 deletions

@@ -8,6 +8,7 @@
 class SparkToolkit(aztk.models.Toolkit):
     def __init__(self, version: str, environment: str = None, environment_version: str = None):
         super().__init__(
+            software="spark",
             version=version,
             environment=environment,
             environment_version=environment_version,

aztk/spark/utils/util.py
Lines changed: 0 additions & 1 deletion

@@ -45,4 +45,3 @@ def wait_for_master_to_be_ready(client, cluster_id: str):
                 "Master didn't become ready before timeout.")

         time.sleep(10)
-        time.sleep(5)
