This repository was archived by the owner on Feb 3, 2021. It is now read-only.

Feature: Readthedocs support #497 (Merged)

timotheeguerin merged 43 commits into master from feature/readthedocs on Apr 26, 2018

Conversation

@timotheeguerin (Member) commented Apr 18, 2018


```python
# define a custom script
custom_script = aztk.spark.models.CustomScript(
```

Review comment (Member): Since custom scripts are going away in favor of plugins, we should probably leave this out.

```python
status = client.get_application_status(cluster_config.cluster_id, app2.name)
```
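The status call above lends itself to a simple polling loop. The sketch below is illustrative only: `FakeClient` is a stand-in so the example is self-contained (only the `get_application_status(cluster_id, app_name)` call mirrors the snippet above), and the status strings are assumed terminal-state names, not aztk's actual values.

```python
import time

class FakeClient:
    """Stand-in for the aztk Spark client; yields a scripted sequence of statuses."""
    def __init__(self, statuses):
        self._statuses = iter(statuses)

    def get_application_status(self, cluster_id, app_name):
        # The real client queries the cluster; this stub just replays a sequence.
        return next(self._statuses)

def wait_for_application(client, cluster_id, app_name, poll_seconds=0.0):
    """Poll until the application reaches an assumed terminal state."""
    terminal = {"completed", "failed"}  # assumption: not aztk's actual state names
    while True:
        status = client.get_application_status(cluster_id, app_name)
        if status in terminal:
            return status
        time.sleep(poll_seconds)

client = FakeClient(["running", "running", "completed"])
print(wait_for_application(client, "my-cluster", "app2"))  # -> completed
```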

## stream logs of app, print to console as it runs
Review comment (Member): "stream" should be "Stream".


## Run application against cluster
Review comment (Member): I feel like "Run an application on the cluster" is better than "against".

@@ -0,0 +1,40 @@
Welcome to aztk's documentation!
================================
Azure Distributed Data Engineering Toolkit (AZTK) is a Python CLI application for provisioning on-demand Spark on Docker clusters in Azure. It's a cheap and easy way to get up and running with a Spark cluster, and a great tool for Spark users who want to experiment and start testing at scale.

Review comment (Member): We should pick between aztk and AZTK and standardize in our docs.

@jafreck (Member) commented Apr 23, 2018

In the SDK docs, I think the only package we should have is the aztk.spark package, with only aztk.spark.models and aztk.spark.client underneath it (not utils). Those should be the only public-facing modules.

Edit: and the aztk.error module.
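The proposal above maps naturally onto Sphinx API pages for the readthedocs build this PR sets up. The fragment below is a hypothetical sketch: the module names come from the comment, but the directive layout is an assumption, not the PR's actual docs.

```rst
.. Hypothetical layout reflecting the proposed public surface
   (aztk.spark.client, aztk.spark.models, and aztk.error only).

aztk.spark.client
-----------------
.. automodule:: aztk.spark.client
   :members:

aztk.spark.models
-----------------
.. automodule:: aztk.spark.models
   :members:

aztk.error
----------
.. automodule:: aztk.error
   :members:
```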

Each Job has one or more applications, given as a list in Job.yaml. Applications are defined using the following properties:
```yaml
applications:
  - name:
```

Review comment (Contributor): I thought we decided to leave in the white space in the YAML since it is required.

Reply (Member, Author): Hmm, I guess it's the auto-formatting that removed those.

pytest==3.1.3
pytest-xdist==1.22.0
twine==1.9.1
docker==3.2.1

Review comment (Member): Why is this here?

@timotheeguerin merged commit e361c3b into master on Apr 26, 2018
@timotheeguerin deleted the feature/readthedocs branch on April 26, 2018 at 22:22

May close issue: Setup readthedocs

3 participants