Merge branch 'dev' into rc
.github/ISSUE_TEMPLATE/bug_report.md
@@ -1,6 +1,6 @@
 ---
 name: Bug report
-about: Create a report to help us improve
+about: Create a report to help esphomelib improve

 ---

@@ -9,7 +9,9 @@ about: Create a report to help us improve
 - esphomeyaml [here] - This is mostly for reporting bugs when compiling and when you get a long stack trace while compiling or if a configuration fails to validate.
 - esphomelib [https://github.com/OttoWinter/esphomelib] - Report bugs there if the ESP is crashing or a feature is not working as expected.
 - esphomedocs [https://github.com/OttoWinter/esphomedocs] - Report bugs there if the documentation is wrong/outdated.
-- Provide as many details as possible. Paste logs, configuration sample and code into the backticks (```). Do not delete any text from this template!
+- Provide as many details as possible. Paste logs, configuration sample and code into the backticks (```).
+
+DO NOT DELETE ANY TEXT from this template! Otherwise the issue may be closed without a comment.
 -->

 **Operating environment (Hass.io/Docker/pip/etc.):**

@@ -33,7 +35,7 @@ Please add the link to the documentation at https://esphomelib.com/esphomeyaml/i

 **Problem-relevant YAML-configuration entries:**
 ```yaml
-
+PASTE YAML FILE HERE
 ```

 **Traceback (if applicable):**

.github/ISSUE_TEMPLATE/feature_request.md
@@ -7,16 +7,15 @@ about: Suggest an idea for this project
 <!-- READ THIS FIRST:
 - This is for feature requests only, if you want to have a certain new sensor/module supported, please use the "new integration" template.
 - Please be as descriptive as possible, especially use-cases that can otherwise not be solved boost the problem's priority.

+DO NOT DELETE ANY TEXT from this template! Otherwise the issue may be closed without a comment.
 -->

-**Is your feature request related to a problem? Please describe.**
-<!--
-A clear and concise description of what the problem is.
--->
-Ex. I'm always frustrated when [...]
+**Is your feature request related to a problem/use-case? Please describe.**
+<!-- A clear and concise description of what the problem is. -->

-**Describe the solution you'd like**
-A description of what you want to happen.
+**Describe the solution you'd like:**
+<!-- A description of what you want to happen. -->

-**Additional context**
-Add any other context about the feature request here.
+**Additional context:**
+<!-- Add any other context about the feature request here. -->

.github/ISSUE_TEMPLATE/new-integration.md
@@ -4,17 +4,10 @@ about: Suggest a new integration for esphomelib

 ---

-<!-- READ THIS FIRST:
-- This is for new integrations (such as new sensors/modules) only, for new features within the environment please use the "feature request" template.
-- Do not delete anything from this template and fill out the form as precisely as possible.
--->
+DO NOT POST NEW INTEGRATION REQUESTS HERE!

-**What new integration would you wish to have?**
-<!-- A name/description of the new integration/board. -->
+Please post all new integration requests in the esphomelib repository:

-**If possible, provide a link to an existing library for the integration:**
+https://github.com/OttoWinter/esphomelib/issues

-**Is your feature request related to a problem? Please describe.**
-A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
+Thank you!

-**Additional context**

.github/PULL_REQUEST_TEMPLATE.md
@@ -6,15 +6,9 @@
 **Pull request in [esphomedocs](https://github.com/OttoWinter/esphomedocs) with documentation (if applicable):** OttoWinter/esphomedocs#<esphomedocs PR number goes here>
 **Pull request in [esphomelib](https://github.com/OttoWinter/esphomelib) with C++ framework changes (if applicable):** OttoWinter/esphomelib#<esphomelib PR number goes here>

-## Example entry for YAML configuration (if applicable):
-```yaml
-
-```
-
 ## Checklist:
   - [ ] The code change is tested and works locally.
   - [ ] Tests have been added to verify that the new code works (under `tests/` folder).
-  - [ ] Check this box if you have read, understand, comply, and agree with the [Code of Conduct](https://github.com/OttoWinter/esphomeyaml/blob/master/CODE_OF_CONDUCT.md).

 If user exposed functionality or configuration variables are added/changed:
   - [ ] Documentation added/updated in [esphomedocs](https://github.com/OttoWinter/esphomedocs).

.gitignore
@@ -105,3 +105,4 @@ venv.bak/

 config/
 tests/build/
+tests/.esphomeyaml/

.gitlab-ci.yml
@@ -11,6 +11,8 @@ stages:

 .lint: &lint
   stage: lint
+  before_script:
+  - pip install -e .
   tags:
   - python2.7
   - esphomeyaml-lint

@@ -24,9 +26,6 @@ stages:
   - esphomeyaml-test
   variables:
     TZ: UTC
-  cache:
-    paths:
-    - tests/build

 .docker-builder: &docker-builder
   before_script:

@@ -62,21 +61,20 @@ test2:
   stage: build
   script:
   - docker run --rm --privileged hassioaddons/qemu-user-static:latest
-  - BUILD_FROM=homeassistant/${ADDON_ARCH}-base-ubuntu:latest
+  - BUILD_FROM=hassioaddons/ubuntu-base-${ADDON_ARCH}:2.2.0
  - ADDON_VERSION="${CI_COMMIT_TAG#v}"
  - ADDON_VERSION="${ADDON_VERSION:-${CI_COMMIT_SHA:0:7}}"
-  - ESPHOMELIB_VERSION="${ESPHOMELIB_VERSION:-dev}"
   - echo "Build from ${BUILD_FROM}"
   - echo "Add-on version ${ADDON_VERSION}"
-  - echo "Esphomelib version ${ESPHOMELIB_VERSION}"
   - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:dev"
   - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
   - |
     docker build \
       --build-arg "BUILD_FROM=${BUILD_FROM}" \
-      --build-arg "ADDON_ARCH=${ADDON_ARCH}" \
-      --build-arg "ADDON_VERSION=${ADDON_VERSION}" \
-      --build-arg "ESPHOMELIB_VERSION=${ESPHOMELIB_VERSION}" \
+      --build-arg "BUILD_DATE=$(date +"%Y-%m-%dT%H:%M:%SZ")" \
+      --build-arg "BUILD_ARCH=${ADDON_ARCH}" \
+      --build-arg "BUILD_REF=${CI_COMMIT_SHA}" \
+      --build-arg "BUILD_VERSION=${ADDON_VERSION}" \
       --tag "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:dev" \
       --tag "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
       --file "docker/Dockerfile.hassio" \
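
Taken together, the updated job assembles a docker build call along these lines — a sketch only; in the real job ADDON_ARCH, the registry prefix and the version come from GitLab CI variables:

```bash
# Sketch of the docker build the CI job above performs (values are placeholders).
ADDON_ARCH=amd64
ADDON_VERSION=1.9.3
docker build \
  --build-arg "BUILD_FROM=hassioaddons/ubuntu-base-${ADDON_ARCH}:2.2.0" \
  --build-arg "BUILD_DATE=$(date +"%Y-%m-%dT%H:%M:%SZ")" \
  --build-arg "BUILD_ARCH=${ADDON_ARCH}" \
  --build-arg "BUILD_REF=$(git rev-parse HEAD)" \
  --build-arg "BUILD_VERSION=${ADDON_VERSION}" \
  --tag "esphomeyaml-hassio-${ADDON_ARCH}:dev" \
  --file "docker/Dockerfile.hassio" \
  .
```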

@@ -95,48 +93,48 @@ test2:
   script:
   - version="${CI_COMMIT_TAG#v}"
   - echo "Publishing release version ${version}"
-  - docker pull "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
+  - docker pull "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
   - docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"

-  - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
+  - echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
-  - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
+  - docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

-  - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:latest"
+  - echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:latest"
-  - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:latest"
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
+  - docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"

-  - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
+  - echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
-  - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
+  - docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"

   - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
       "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
   - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

   - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
       "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
   - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"

   - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
       "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   only:

@@ -150,34 +148,34 @@ test2:
   script:
   - version="${CI_COMMIT_TAG#v}"
   - echo "Publishing beta version ${version}"
-  - docker pull "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
+  - docker pull "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
   - docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"

-  - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
+  - echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
-  - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
+  - docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

-  - echo "Tag ${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
+  - echo "Tag ${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
-  - docker push "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:rc"
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
+  - docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"

   - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
       "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
   - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"

   - echo "Tag ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   - |
     docker tag \
-      "${CI_REGISTRY}/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
+      "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
       "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   - docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
   only:

@@ -190,7 +188,7 @@ build:normal:
   <<: *docker-builder
   stage: build
   script:
-  - docker build -t "${CI_REGISTRY}/esphomeyaml:dev" .
+  - docker build -t "${CI_REGISTRY}/ottowinter/esphomeyaml:dev" .

 .build-hassio-edge: &build-hassio-edge
   <<: *build-hassio

@@ -214,7 +212,6 @@ build:hassio-armhf:
   <<: *build-hassio-release
   variables:
     ADDON_ARCH: armhf
-    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

 #build:hassio-aarch64-edge:
 #  <<: *build-hassio-edge

@@ -226,7 +223,6 @@ build:hassio-armhf:
 #  <<: *build-hassio-release
 #  variables:
 #    ADDON_ARCH: aarch64
-#    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

 build:hassio-i386-edge:
   <<: *build-hassio-edge

@@ -238,7 +234,6 @@ build:hassio-i386:
   <<: *build-hassio-release
   variables:
     ADDON_ARCH: i386
-    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

 build:hassio-amd64-edge:
   <<: *build-hassio-edge

@@ -250,7 +245,6 @@ build:hassio-amd64:
   <<: *build-hassio-release
   variables:
     ADDON_ARCH: amd64
-    ESPHOMELIB_VERSION: "${CI_COMMIT_TAG}"

 # Deploy jobs
 deploy-release:armhf:

@@ -267,7 +261,7 @@ deploy-beta:armhf:
 #  <<: *deploy-release
 #  variables:
 #    ADDON_ARCH: aarch64
-#
+
 #deploy-beta:aarch64:
 #  <<: *deploy-beta
 #  variables:

.travis.yml
@@ -1,20 +1,30 @@
 sudo: false
 language: python
-python:
-  - "2.7"
-jobs:
+matrix:
+  fast_finish: true
   include:
-  - name: "Lint"
-    install:
-    - pip install -r requirements.txt
-    - pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
+  - python: "2.7"
+    env: TARGET=Lint2.7
+    install: pip install -e . && pip install flake8==3.6.0 pylint==1.9.4 pillow
     script:
     - flake8 esphomeyaml
     - pylint esphomeyaml
-  - name: "Test"
-    install:
-    - pip install -e .
-    - pip install tzlocal pillow
+  - python: "3.5.3"
+    env: TARGET=Lint3.5
+    install: pip install -U https://github.com/platformio/platformio-core/archive/develop.zip && pip install -e . && pip install flake8==3.6.0 pylint==2.2.2 pillow
+    script:
+    - flake8 esphomeyaml
+    - pylint esphomeyaml
+  - python: "2.7"
+    env: TARGET=Test2.7
+    install: pip install -e . && pip install flake8==3.6.0 pylint==1.9.4 pillow
+    script:
+    - esphomeyaml tests/test1.yaml compile
+    - esphomeyaml tests/test2.yaml compile
+  - python: "3.5.3"
+    env: TARGET=Test3.5
+    install: pip install -U https://github.com/platformio/platformio-core/archive/develop.zip && pip install -e . && pip install flake8==3.6.0 pylint==2.2.2 pillow
     script:
     - esphomeyaml tests/test1.yaml compile
     - esphomeyaml tests/test2.yaml compile
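
The new Travis matrix entries can also be run by hand; for instance the TARGET=Test2.7 job boils down to the following sketch, assuming a Python 2.7 environment and a checkout of this repository:

```bash
# Rough local equivalent of the TARGET=Test2.7 job above
pip install -e . && pip install flake8==3.6.0 pylint==1.9.4 pillow
esphomeyaml tests/test1.yaml compile
esphomeyaml tests/test2.yaml compile
```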

Dockerfile
@@ -21,8 +21,7 @@ COPY docker/platformio.ini /pio/platformio.ini
 RUN platformio run -d /pio; rm -rf /pio

 COPY . .
-RUN pip install --no-cache-dir --no-binary :all: -e . && \
-    pip install --no-cache-dir --no-binary :all: tzlocal
+RUN pip install --no-cache-dir --no-binary :all: -e .

 WORKDIR /config
 ENTRYPOINT ["esphomeyaml"]

MANIFEST.in
@@ -1,4 +1,17 @@
 include README.md
 include esphomeyaml/dashboard/templates/index.html
+include esphomeyaml/dashboard/templates/login.html
+include esphomeyaml/dashboard/static/ace.js
+include esphomeyaml/dashboard/static/esphomeyaml.css
+include esphomeyaml/dashboard/static/esphomeyaml.js
+include esphomeyaml/dashboard/static/favicon.ico
+include esphomeyaml/dashboard/static/jquery.min.js
+include esphomeyaml/dashboard/static/jquery.validate.min.js
+include esphomeyaml/dashboard/static/jquery-ui.min.js
+include esphomeyaml/dashboard/static/materialize.min.css
+include esphomeyaml/dashboard/static/materialize.min.js
 include esphomeyaml/dashboard/static/materialize-stepper.min.css
 include esphomeyaml/dashboard/static/materialize-stepper.min.js
+include esphomeyaml/dashboard/static/mode-yaml.js
+include esphomeyaml/dashboard/static/theme-dreamweaver.js
+include esphomeyaml/dashboard/static/ext-searchbox.js
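
MANIFEST.in only affects what lands in the source distribution, so a quick way to confirm the new dashboard assets are actually packaged is to build an sdist and list it (a sketch; the archive name depends on the version being built):

```bash
# Sketch: build a source distribution and check that the dashboard assets are inside
python setup.py sdist
tar -tzf dist/esphomeyaml-*.tar.gz | grep dashboard/static
```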

docker/Dockerfile.hassio
@@ -1,42 +1,75 @@
-# Dockerfile for HassIO add-on
-ARG BUILD_FROM=homeassistant/amd64-base-ubuntu:latest
-FROM ${BUILD_FROM}
-
-RUN apt-get update && apt-get install -y --no-install-recommends \
-        python \
-        python-pip \
-        python-setuptools \
-        python-pil \
-        git \
-    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
-    pip install --no-cache-dir --no-binary :all: platformio && \
-    platformio settings set enable_telemetry No && \
-    platformio settings set check_libraries_interval 1000000 && \
-    platformio settings set check_platformio_interval 1000000 && \
-    platformio settings set check_platforms_interval 1000000
-
-COPY docker/platformio.ini /pio/platformio.ini
-RUN platformio run -d /pio; rm -rf /pio
-
-ARG ESPHOMELIB_VERSION="dev"
-RUN platformio lib -g install "https://github.com/OttoWinter/esphomelib.git#${ESPHOMELIB_VERSION}"
-
-COPY . .
-RUN pip install --no-cache-dir --no-binary :all: -e . && \
-    pip install --no-cache-dir --no-binary :all: tzlocal
-
-CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
-
-# Build arugments
-ARG ADDON_ARCH
-ARG ADDON_VERSION
-
-# Labels
-LABEL \
-    io.hass.name="esphomeyaml" \
-    io.hass.description="esphomeyaml HassIO add-on for intelligently managing all your ESP8266/ESP32 devices." \
-    io.hass.arch="${ADDON_ARCH}" \
-    io.hass.type="addon" \
-    io.hass.version="${ADDON_VERSION}" \
-    io.hass.url="https://esphomelib.com/esphomeyaml/index.html" \
-    maintainer="Otto Winter <contact@otto-winter.com>"
+ARG BUILD_FROM=hassioaddons/ubuntu-base:2.2.0
+# hadolint ignore=DL3006
+FROM ${BUILD_FROM}
+
+# Set shell
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+
+# Copy root filesystem
+COPY esphomeyaml-edge/rootfs /
+COPY setup.py setup.cfg MANIFEST.in /opt/esphomeyaml/
+COPY esphomeyaml /opt/esphomeyaml/esphomeyaml
+
+RUN \
+    # Temporarily move nginx.conf (otherwise dpkg fails)
+    mv /etc/nginx/nginx.conf /etc/nginx/nginx.conf.bkp \
+    # Install add-on dependencies
+    && apt-get update \
+    && apt-get install -y --no-install-recommends \
+        # Python for esphomeyaml
+        python \
+        python-pip \
+        python-setuptools \
+        # Python Pillow for display component
+        python-pil \
+        # Git for esphomelib downloads
+        git \
+        # Ping for dashboard online/offline status
+        iputils-ping \
+        # NGINX proxy
+        nginx \
+    \
+    && mv /etc/nginx/nginx.conf.bkp /etc/nginx/nginx.conf \
+    \
+    && pip2 install --no-cache-dir --no-binary :all: -e /opt/esphomeyaml \
+    \
+    # Change some platformio settings
+    && platformio settings set enable_telemetry No \
+    && platformio settings set check_libraries_interval 1000000 \
+    && platformio settings set check_platformio_interval 1000000 \
+    && platformio settings set check_platforms_interval 1000000 \
+    \
+    # Build an empty platformio project to force platformio to install all fw build dependencies
+    # The return-code will be non-zero since there's nothing to build.
+    && (platformio run -d /opt/pio; echo "Done") \
+    \
+    # Cleanup
+    && rm -fr \
+        /tmp/* \
+        /var/{cache,log}/* \
+        /var/lib/apt/lists/* \
+        /opt/pio/
+
+# Build arugments
+ARG BUILD_ARCH=amd64
+ARG BUILD_DATE
+ARG BUILD_REF
+ARG BUILD_VERSION
+
+# Labels
+LABEL \
+    io.hass.name="esphomeyaml" \
+    io.hass.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
+    io.hass.arch="${BUILD_ARCH}" \
+    io.hass.type="addon" \
+    io.hass.version=${BUILD_VERSION} \
+    maintainer="Otto Winter <contact@otto-winter.com>" \
+    org.label-schema.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
+    org.label-schema.build-date=${BUILD_DATE} \
+    org.label-schema.name="esphomeyaml" \
+    org.label-schema.schema-version="1.0" \
+    org.label-schema.url="https://esphomelib.com" \
+    org.label-schema.usage="https://github.com/OttoWinter/esphomeyaml/tree/dev/esphomeyaml/README.md" \
+    org.label-schema.vcs-ref=${BUILD_REF} \
+    org.label-schema.vcs-url="https://github.com/OttoWinter/esphomeyaml" \
+    org.label-schema.vendor="esphomelib"
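
The BUILD_* arguments exist mainly to populate the label-schema metadata above; after a local build they can be read back from the image, for example (a sketch — the tag is whatever the build produced):

```bash
# Sketch: dump the labels baked into a locally built add-on image
docker inspect --format '{{ json .Config.Labels }}' esphomeyaml-hassio-amd64:dev | python -m json.tool
```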

@@ -3,4 +3,4 @@ FROM python:2.7
 COPY requirements.txt /requirements.txt

 RUN pip install -r /requirements.txt && \
-    pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
+    pip install flake8==3.6.0 pylint==1.9.4 pillow

@@ -8,12 +8,14 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
        git \
     && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/*rm -rf /var/lib/apt/lists/* /tmp/* && \
     pip install --no-cache-dir --no-binary :all: platformio && \
-    platformio settings set enable_telemetry No
+    platformio settings set enable_telemetry No && \
+    platformio settings set check_libraries_interval 1000000 && \
+    platformio settings set check_platformio_interval 1000000 && \
+    platformio settings set check_platforms_interval 1000000

 COPY docker/platformio.ini /pio/platformio.ini
 RUN platformio run -d /pio; rm -rf /pio

 COPY requirements.txt /requirements.txt

-RUN pip install --no-cache-dir -r /requirements.txt && \
-    pip install --no-cache-dir tzlocal pillow
+RUN pip install --no-cache-dir -r /requirements.txt

esphomeyaml-beta/config.json
@@ -1,8 +1,8 @@
 {
   "name": "esphomeyaml-beta",
-  "version": "1.9.0b5",
+  "version": "1.9.3",
   "slug": "esphomeyaml-beta",
-  "description": "Beta version of esphomeyaml HassIO add-on.",
+  "description": "Beta version of esphomeyaml Hass.io add-on.",
   "url": "https://beta.esphomelib.com/esphomeyaml/index.html",
   "startup": "application",
   "webui": "http://[HOST]:[PORT:6052]",

New binary files: esphomeyaml-beta/icon.png (2.8 KiB), esphomeyaml-beta/logo.png (5.5 KiB).

esphomeyaml-edge/Dockerfile
@@ -1,24 +1,73 @@
-# Dockerfile for HassIO edge add-on
-ARG BUILD_FROM=homeassistant/amd64-base-ubuntu:latest
-FROM ${BUILD_FROM}
-
-RUN apt-get update && apt-get install -y --no-install-recommends \
-        python \
-        python-pip \
-        python-setuptools \
-        python-pil \
-        git \
-    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
-    pip install --no-cache-dir --no-binary :all: platformio && \
-    platformio settings set enable_telemetry No && \
-    platformio settings set check_libraries_interval 1000000 && \
-    platformio settings set check_platformio_interval 1000000 && \
-    platformio settings set check_platforms_interval 1000000
-
-COPY platformio.ini /pio/platformio.ini
-RUN platformio run -d /pio; rm -rf /pio
-
-RUN pip install --no-cache-dir git+https://github.com/OttoWinter/esphomeyaml.git@dev#egg=esphomeyaml && \
-    pip install --no-cache-dir pillow tzlocal
-
-CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
+ARG BUILD_FROM=hassioaddons/ubuntu-base:2.2.0
+# hadolint ignore=DL3006
+FROM ${BUILD_FROM}
+
+# Set shell
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+
+# Copy root filesystem
+COPY rootfs /
+
+RUN \
+    # Temporarily move nginx.conf (otherwise dpkg fails)
+    mv /etc/nginx/nginx.conf /etc/nginx/nginx.conf.bkp \
+    # Install add-on dependencies
+    && apt-get update \
+    && apt-get install -y --no-install-recommends \
+        # Python for esphomeyaml
+        python \
+        python-pip \
+        python-setuptools \
+        # Python Pillow for display component
+        python-pil \
+        # Git for esphomelib downloads
+        git \
+        # Ping for dashboard online/offline status
+        iputils-ping \
+        # NGINX proxy
+        nginx \
+    \
+    && mv /etc/nginx/nginx.conf.bkp /etc/nginx/nginx.conf \
+    \
+    && pip2 install --no-cache-dir --no-binary :all: https://github.com/OttoWinter/esphomeyaml/archive/dev.zip \
+    \
+    # Change some platformio settings
+    && platformio settings set enable_telemetry No \
+    && platformio settings set check_libraries_interval 1000000 \
+    && platformio settings set check_platformio_interval 1000000 \
+    && platformio settings set check_platforms_interval 1000000 \
+    \
+    # Build an empty platformio project to force platformio to install all fw build dependencies
+    # The return-code will be non-zero since there's nothing to build.
+    && (platformio run -d /opt/pio; echo "Done") \
+    \
+    # Cleanup
+    && rm -fr \
+        /tmp/* \
+        /var/{cache,log}/* \
+        /var/lib/apt/lists/* \
+        /opt/pio/
+
+# Build arugments
+ARG BUILD_ARCH=amd64
+ARG BUILD_DATE
+ARG BUILD_REF
+ARG BUILD_VERSION
+
+# Labels
+LABEL \
+    io.hass.name="esphomeyaml-edge" \
+    io.hass.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
+    io.hass.arch="${BUILD_ARCH}" \
+    io.hass.type="addon" \
+    io.hass.version=${BUILD_VERSION} \
+    maintainer="Otto Winter <contact@otto-winter.com>" \
+    org.label-schema.description="Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files" \
+    org.label-schema.build-date=${BUILD_DATE} \
+    org.label-schema.name="esphomeyaml-edge" \
+    org.label-schema.schema-version="1.0" \
+    org.label-schema.url="https://esphomelib.com" \
+    org.label-schema.usage="https://github.com/OttoWinter/esphomeyaml/tree/dev/esphomeyaml-edge/README.md" \
+    org.label-schema.vcs-ref=${BUILD_REF} \
+    org.label-schema.vcs-url="https://github.com/OttoWinter/esphomeyaml" \
+    org.label-schema.vendor="esphomelib"

esphomeyaml-edge/README.md (new file, 109 lines)

# Esphomeyaml Hass.io Add-On

[![esphomeyaml logo](https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/logo.png)](https://esphomelib.com/esphomeyaml/index.html)

[![GitHub stars](https://img.shields.io/github/stars/OttoWinter/esphomelib.svg?style=social&label=Star&maxAge=2592000)](https://github.com/OttoWinter/esphomelib)
[![GitHub Release][releases-shield]][releases]
[![Discord][discord-shield]][discord]

## About

This add-on allows you to manage and program your ESP8266 and ESP32 based microcontrollers
directly through Hass.io **with no programming experience required**. All you need to do
is write YAML configuration files; the rest (over-the-air updates, compiling) is all
handled by esphomeyaml.

<p align="center">
<img title="esphomeyaml dashboard screenshot" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/screenshot.png" width="700px"></img>
</p>

[_View the esphomeyaml documentation here_](https://esphomelib.com/esphomeyaml/index.html)

## Example

With esphomeyaml, you can go from a few lines of YAML straight to a custom-made
firmware. For example, to include a [DHT22][dht22]
temperature and humidity sensor, you just need to include 8 lines of YAML
in your configuration file:

<img title="esphomeyaml DHT configuration example" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/dht-example.png" width="500px"></img>

Then just click UPLOAD and the sensor will magically appear in Home Assistant:

<img title="esphomelib Home Assistant MQTT discovery" src="https://raw.githubusercontent.com/OttoWinter/esphomeyaml/dev/esphomeyaml-edge/images/temperature-humidity.png" width="600px"></img>

## Installation

To install this Hass.io add-on you need to add the esphomeyaml add-on repository
first:

1. Add the esphomeyaml add-ons repository to your Hass.io instance. You can do this by navigating to the "Add-on Store" tab in the Hass.io panel and then entering https://github.com/OttoWinter/esphomeyaml in the "Add new repository by URL" field.
2. Now scroll down and select the "esphomeyaml" add-on.
3. Press install to download the add-on and unpack it on your machine. This can take some time.
4. Optional: If you're using SSL certificates and want to encrypt your communication to this add-on, please enter `true` into the `ssl` field and set the `fullchain` and `certfile` options accordingly.
5. Start the add-on, check the logs of the add-on to see if everything went well.
6. Click "OPEN WEB UI" to open the esphomeyaml dashboard. You will be asked for your Home Assistant credentials - esphomeyaml uses Hass.io's authentication system to log you in.

**NOTE**: Installation on RPis running in 64-bit mode is currently not possible. Please use the 32-bit variant of HassOS instead.

You can view the esphomeyaml docs here: https://esphomelib.com/esphomeyaml/index.html

## Configuration

**Note**: _Remember to restart the add-on when the configuration is changed._

Example add-on configuration:

```json
{
  "ssl": false,
  "certfile": "fullchain.pem",
  "keyfile": "privkey.pem",
  "port": 6052
}
```

### Option: `port`

The port to start the dashboard server on. Default is 6052.

### Option: `ssl`

Enables/Disables encrypted SSL (HTTPS) connections to the web server of this add-on.
Set it to `true` to encrypt communications, `false` otherwise.
Please note that if you set this to `true` you must also generate the key and certificate
files for encryption. For example using [Let's Encrypt](https://www.home-assistant.io/addons/lets_encrypt/)
or [Self-signed certificates](https://www.home-assistant.io/docs/ecosystem/certificates/tls_self_signed_certificate/).

### Option: `certfile`

The certificate file to use for SSL. If this file doesn't exist, the add-on start will fail.

**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io

### Option: `keyfile`

The private key file to use for SSL. If this file doesn't exist, the add-on start will fail.

**Note**: The file MUST be stored in `/ssl/`, which is the default for Hass.io

### Option: `leave_front_door_open`

Adding this option to the add-on configuration allows you to disable
authentication by setting it to `true`.

### Option: `esphomeyaml_version`

Manually override which esphomeyaml version to use in the add-on.
For example to install the latest development version, use `"esphomeyaml_version": "dev"`,
or for version 1.10.0: `"esphomeyaml_version": "v1.10.0"`.

Please note that this does not always work and is only meant for testing, usually the
esphomeyaml add-on and dashboard version must match to guarantee a working system.

[discord-shield]: https://img.shields.io/discord/429907082951524364.svg
[dht22]: https://esphomelib.com/esphomeyaml/components/sensor/dht.html
[discord]: https://discord.me/KhAMKrd
[releases-shield]: https://img.shields.io/github/release/OttoWinter/esphomeyaml.svg
[releases]: https://esphomelib.com/esphomeyaml/changelog/index.html
[repository]: https://github.com/OttoWinter/esphomeyaml

esphomeyaml-edge/build.json
@@ -1,10 +1,10 @@
 {
   "squash": false,
   "build_from": {
-    "aarch64": "homeassistant/aarch64-base-ubuntu:latest",
-    "amd64": "homeassistant/amd64-base-ubuntu:latest",
-    "armhf": "homeassistant/armhf-base-ubuntu:latest",
-    "i386": "homeassistant/i386-base-ubuntu:latest"
+    "aarch64": "hassioaddons/ubuntu-base-aarch64:2.2.0",
+    "amd64": "hassioaddons/ubuntu-base-amd64:2.2.0",
+    "armhf": "hassioaddons/ubuntu-base-armhf:2.2.0",
+    "i386": "hassioaddons/ubuntu-base-i386:2.2.0"
   },
   "args": {}
 }

esphomeyaml-edge/config.json
@@ -2,32 +2,38 @@
   "name": "esphomeyaml-edge",
   "version": "dev",
   "slug": "esphomeyaml-edge",
-  "description": "Development build of the esphomeyaml HassIO add-on.",
-  "url": "https://esphomelib.com/esphomeyaml/index.html",
-  "startup": "application",
+  "description": "Development Version! Manage and program ESP8266/ESP32 microcontrollers through YAML configuration files",
+  "url": "https://github.com/OttoWinter/esphomeyaml/tree/dev/esphomeyaml-edge/README.md",
   "webui": "http://[HOST]:[PORT:6052]",
-  "boot": "auto",
-  "ports": {
-    "6052/tcp": 6052,
-    "6053/tcp": 6053
-  },
+  "startup": "application",
   "arch": [
     "aarch64",
     "amd64",
     "armhf",
     "i386"
   ],
-  "auto_uart": true,
+  "hassio_api": true,
+  "auth_api": true,
+  "hassio_role": "default",
+  "homeassistant_api": false,
+  "host_network": true,
+  "boot": "auto",
   "map": [
+    "ssl",
     "config:rw"
   ],
   "options": {
-    "password": ""
+    "ssl": false,
+    "certfile": "fullchain.pem",
+    "keyfile": "privkey.pem",
+    "port": 6052
   },
   "schema": {
-    "password": "str?"
-  },
-  "environment": {
-    "ESPHOMEYAML_OTA_HOST_PORT": "6053"
+    "ssl": "bool",
+    "certfile": "str",
+    "keyfile": "str",
+    "port": "int",
+    "leave_front_door_open": "bool?",
+    "esphomeyaml_version": "str?"
   }
 }

New binary files: esphomeyaml-edge/icon.png (2.8 KiB), esphomeyaml-edge/images/dht-example.png (17 KiB), esphomeyaml-edge/images/screenshot.png (50 KiB), esphomeyaml-edge/images/temperature-humidity.png (5.3 KiB), esphomeyaml-edge/logo.png (8.6 KiB).

esphomeyaml-edge/rootfs/etc/cont-init.d/10-requirements.sh (new executable file, 35 lines)

#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# This files check if all user configuration requirements are met
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

# Check SSL requirements, if enabled
if hass.config.true 'ssl'; then
    if ! hass.config.has_value 'certfile'; then
        hass.die 'SSL is enabled, but no certfile was specified.'
    fi

    if ! hass.config.has_value 'keyfile'; then
        hass.die 'SSL is enabled, but no keyfile was specified'
    fi

    if ! hass.file_exists "/ssl/$(hass.config.get 'certfile')"; then
        if ! hass.file_exists "/ssl/$(hass.config.get 'keyfile')"; then
            # Both files are missing, let's print a friendlier error message
            text="You enabled encrypted connections using the \"ssl\": true option.
However, the SSL files \"$(hass.config.get 'certfile')\" and \"$(hass.config.get 'keyfile')\"
were not found. If you're using Hass.io on your local network and don't want
to encrypt connections to the esphomeyaml dashboard, you can manually disable
SSL by setting \"ssl\" to false."
            hass.die "${text}"
        fi
        hass.die 'The configured certfile is not found'
    fi

    if ! hass.file_exists "/ssl/$(hass.config.get 'keyfile')"; then
        hass.die 'The configured keyfile is not found'
    fi
fi

esphomeyaml-edge/rootfs/etc/cont-init.d/20-nginx.sh (new executable file, 28 lines)

#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Configures NGINX for use with esphomeyaml
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

declare certfile
declare keyfile
declare port

mkdir -p /var/log/nginx

# Enable SSL
if hass.config.true 'ssl'; then
    rm /etc/nginx/nginx.conf
    mv /etc/nginx/nginx-ssl.conf /etc/nginx/nginx.conf

    certfile=$(hass.config.get 'certfile')
    keyfile=$(hass.config.get 'keyfile')

    sed -i "s/%%certfile%%/${certfile}/g" /etc/nginx/nginx.conf
    sed -i "s/%%keyfile%%/${keyfile}/g" /etc/nginx/nginx.conf
fi

port=$(hass.config.get 'port')
sed -i "s/%%port%%/${port}/g" /etc/nginx/nginx.conf

esphomeyaml-edge/rootfs/etc/cont-init.d/30-esphomeyaml.sh (new file, 14 lines)

#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# This files installs the user esphomeyaml version if specified
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

declare esphomeyaml_version

if hass.config.has_value 'esphomeyaml_version'; then
    esphomeyaml_version=$(hass.config.get 'esphomeyaml_version')
    pip2 install --no-cache-dir --no-binary :all: "https://github.com/OttoWinter/esphomeyaml/archive/${esphomeyaml_version}.zip"
fi
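
With the `esphomeyaml_version` option set to a tag such as `v1.10.0` (the example used in the README above), this init script effectively runs the following sketch:

```bash
# Sketch: what 30-esphomeyaml.sh does when "esphomeyaml_version": "v1.10.0" is configured
esphomeyaml_version="v1.10.0"
pip2 install --no-cache-dir --no-binary :all: \
    "https://github.com/OttoWinter/esphomeyaml/archive/${esphomeyaml_version}.zip"
```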

esphomeyaml-edge/rootfs/etc/nginx/nginx-ssl.conf (new executable file, 62 lines)

worker_processes  1;
pid /var/run/nginx.pid;
error_log stderr;

events {
    worker_connections  1024;
}

http {
    access_log stdout;
    include mime.types;
    default_type  application/octet-stream;
    sendfile on;
    keepalive_timeout  65;

    upstream esphomeyaml {
        ip_hash;
        server unix:/var/run/esphomeyaml.sock;
    }
    map $http_upgrade $connection_upgrade {
        default upgrade;
        ''      close;
    }

    server {
        server_name hassio.local;
        listen %%port%% default_server ssl;
        root /dev/null;

        ssl_certificate /ssl/%%certfile%%;
        ssl_certificate_key /ssl/%%keyfile%%;
        ssl_protocols TLSv1.2;
        ssl_prefer_server_ciphers on;
        ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:DHE-RSA-AES256-SHA;
        ssl_ecdh_curve secp384r1;
        ssl_session_timeout  10m;
        ssl_session_cache shared:SSL:10m;
        ssl_session_tickets off;
        ssl_stapling on;
        ssl_stapling_verify on;

        # Redirect http requests to https on the same port.
        # https://rageagainstshell.com/2016/11/redirect-http-to-https-on-the-same-port-in-nginx/
        error_page 497 https://$http_host$request_uri;

        location / {
            proxy_redirect off;
            proxy_pass http://esphomeyaml;

            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection $connection_upgrade;
            proxy_set_header Authorization "";

            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header Host $http_host;
            proxy_set_header X-NginX-Proxy true;
        }
    }
}

esphomeyaml-edge/rootfs/etc/nginx/nginx.conf (new executable file, 46 lines)

worker_processes  1;
pid /var/run/nginx.pid;
error_log stderr;

events {
    worker_connections  1024;
}

http {
    access_log stdout;
    include mime.types;
    default_type  application/octet-stream;
    sendfile on;
    keepalive_timeout  65;

    upstream esphomeyaml {
        ip_hash;
        server unix:/var/run/esphomeyaml.sock;
    }
    map $http_upgrade $connection_upgrade {
        default upgrade;
        ''      close;
    }

    server {
        server_name hassio.local;
        listen %%port%% default_server;
        root /dev/null;

        location / {
            proxy_redirect off;
            proxy_pass http://esphomeyaml;

            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection $connection_upgrade;
            proxy_set_header Authorization "";

            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header Host $http_host;
            proxy_set_header X-NginX-Proxy true;
        }
    }
}
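
The `%%port%%`, `%%certfile%%` and `%%keyfile%%` placeholders are only filled in by 20-nginx.sh at container start, so one way to sanity-check the result is to substitute them the same way and let nginx validate the file (a sketch, run inside the add-on container):

```bash
# Sketch: render the port placeholder the same way 20-nginx.sh does, then validate the config
sed -i "s/%%port%%/6052/g" /etc/nginx/nginx.conf
nginx -t -c /etc/nginx/nginx.conf
```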

esphomeyaml-edge/rootfs/etc/services.d/esphomeyaml/finish (new executable file, 9 lines)

#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Take down the S6 supervision tree when esphomeyaml fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }

s6-svscanctl -t /var/run/s6/services

esphomeyaml-edge/rootfs/etc/services.d/esphomeyaml/run (new executable file, 14 lines)

#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Runs the esphomeyaml dashboard
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

if hass.config.true 'leave_front_door_open'; then
    export DISABLE_HA_AUTHENTICATION=true
fi

hass.log.info "Starting esphomeyaml dashboard..."
exec esphomeyaml /config/esphomeyaml dashboard --socket /var/run/esphomeyaml.sock --hassio

esphomeyaml-edge/rootfs/etc/services.d/nginx/finish (new executable file, 9 lines)

#!/usr/bin/execlineb -S0
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Take down the S6 supervision tree when NGINX fails
# ==============================================================================
if -n { s6-test $# -ne 0 }
if -n { s6-test ${1} -eq 256 }

s6-svscanctl -t /var/run/s6/services

esphomeyaml-edge/rootfs/etc/services.d/nginx/run (new executable file, 10 lines)

#!/usr/bin/with-contenv bash
# ==============================================================================
# Community Hass.io Add-ons: esphomeyaml
# Runs the NGINX proxy
# ==============================================================================
# shellcheck disable=SC1091
source /usr/lib/hassio-addons/base.sh

hass.log.info "Starting NGINX..."
exec nginx -g "daemon off;"

esphomeyaml-edge/rootfs/opt/pio/platformio.ini (new file, 12 lines)

; This file allows the docker build file to install the required platformio
; platforms

[env:espressif8266]
platform = espressif8266
board = nodemcuv2
framework = arduino

[env:espressif32]
platform = espressif32
board = nodemcu-32s
framework = arduino
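
This stub project is what both Dockerfiles above build once so that PlatformIO downloads the ESP8266/ESP32 toolchains into the image ahead of time; the corresponding step is roughly the following sketch:

```bash
# Sketch: pre-download the ESP8266/ESP32 toolchains by building the stub project once.
# No firmware is produced, so a non-zero exit code is expected and deliberately ignored.
(platformio run -d /opt/pio; echo "Done")
```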

esphomeyaml/__main__.py
@@ -2,25 +2,29 @@ from __future__ import print_function

 import argparse
 from collections import OrderedDict
+from datetime import datetime
 import logging
 import os
 import random
 import sys
-from datetime import datetime

-from esphomeyaml import const, core, core_config, mqtt, wizard, writer, yaml_util, platformio_api
-from esphomeyaml.config import get_component, iter_components, read_config
-from esphomeyaml.const import CONF_BAUD_RATE, CONF_BUILD_PATH, CONF_DOMAIN, CONF_ESPHOMEYAML, \
-    CONF_HOSTNAME, CONF_LOGGER, CONF_MANUAL_IP, CONF_NAME, CONF_STATIC_IP, CONF_USE_CUSTOM_CODE, \
-    CONF_WIFI, ESP_PLATFORM_ESP8266
-from esphomeyaml.core import ESPHomeYAMLError
-from esphomeyaml.helpers import AssignmentExpression, Expression, RawStatement, \
-    _EXPRESSIONS, add, add_job, color, flush_tasks, indent, statement, relative_path
-from esphomeyaml.util import safe_print, run_external_command
+from esphomeyaml import const, core_config, mqtt, platformio_api, wizard, writer, yaml_util
+from esphomeyaml.api.client import run_logs
+from esphomeyaml.config import get_component, iter_components, read_config, strip_default_ids
+from esphomeyaml.const import CONF_BAUD_RATE, CONF_ESPHOMEYAML, CONF_LOGGER, CONF_USE_CUSTOM_CODE, \
+    CONF_BROKER
+from esphomeyaml.core import CORE, EsphomeyamlError
+from esphomeyaml.cpp_generator import Expression, RawStatement, add, statement
+from esphomeyaml.helpers import color, indent
+from esphomeyaml.py_compat import safe_input, text_type, IS_PY2
+from esphomeyaml.storage_json import StorageJSON, esphomeyaml_storage_path, \
+    start_update_check_thread, storage_path
+from esphomeyaml.util import run_external_command, safe_print

 _LOGGER = logging.getLogger(__name__)

-PRE_INITIALIZE = ['esphomeyaml', 'logger', 'wifi', 'ota', 'mqtt', 'web_server', 'i2c']
+PRE_INITIALIZE = ['esphomeyaml', 'logger', 'wifi', 'ethernet', 'ota', 'mqtt', 'web_server', 'api',
+                  'i2c']


 def get_serial_ports():

@@ -32,37 +36,64 @@ def get_serial_ports():
             continue
         if "VID:PID" in info:
             result.append((port, desc))
+    result.sort(key=lambda x: x[0])
     return result


-def choose_serial_port(config):
-    result = get_serial_ports()
-    if not result:
-        return 'OTA'
-    safe_print(u"Found multiple serial port options, please choose one:")
-    for i, (res, desc) in enumerate(result):
-        safe_print(u"  [{}] {} ({})".format(i, res, desc))
-    safe_print(u"  [{}] Over The Air ({})".format(len(result), get_upload_host(config)))
-    safe_print()
+def choose_prompt(options):
+    if not options:
+        raise ValueError
+
+    if len(options) == 1:
+        return options[0][1]
+
+    safe_print(u"Found multiple options, please choose one:")
+    for i, (desc, _) in enumerate(options):
+        safe_print(u"  [{}] {}".format(i + 1, desc))
+
     while True:
-        opt = raw_input('(number): ')
-        if opt in result:
-            opt = result.index(opt)
+        opt = safe_input('(number): ')
+        if opt in options:
+            opt = options.index(opt)
             break
         try:
            opt = int(opt)
-            if opt < 0 or opt > len(result):
+            if opt < 1 or opt > len(options):
                raise ValueError
            break
        except ValueError:
            safe_print(color('red', u"Invalid option: '{}'".format(opt)))
-    if opt == len(result):
-        return 'OTA'
-    return result[opt][0]
+    return options[opt - 1][1]


-def run_miniterm(config, port, escape=False):
+def choose_upload_log_host(default, check_default, show_ota, show_mqtt, show_api):
+    options = []
+    for res, desc in get_serial_ports():
+        options.append((u"{} ({})".format(res, desc), res))
+    if (show_ota and 'ota' in CORE.config) or (show_api and 'api' in CORE.config):
+        options.append((u"Over The Air ({})".format(CORE.address), CORE.address))
+        if default == 'OTA':
+            return CORE.address
+    if show_mqtt and 'mqtt' in CORE.config:
+        options.append((u"MQTT ({})".format(CORE.config['mqtt'][CONF_BROKER]), 'MQTT'))
+        if default == 'OTA':
+            return 'MQTT'
+    if default is not None:
+        return default
+    if check_default is not None and check_default in [opt[1] for opt in options]:
+        return check_default
+    return choose_prompt(options)
+
+
+def get_port_type(port):
+    if port.startswith('/') or port.startswith('COM'):
+        return 'SERIAL'
+    if port == 'MQTT':
+        return 'MQTT'
+    return 'NETWORK'
+
+
+def run_miniterm(config, port):
     import serial
     if CONF_LOGGER not in config:
         _LOGGER.info("Logger is not enabled. Not starting UART logs.")

@@ -80,11 +111,13 @@ def run_miniterm(config, port, escape=False):
         except serial.SerialException:
             _LOGGER.error("Serial port closed!")
             return
+            if IS_PY2:
                 line = raw.replace('\r', '').replace('\n', '')
+            else:
+                line = raw.replace(b'\r', b'').replace(b'\n', b'').decode('utf8',
+                                                                          'backslashreplace')
             time = datetime.now().time().strftime('[%H:%M:%S]')
|
||||||
message = time + line
|
message = time + line
|
||||||
if escape:
|
|
||||||
message = message.replace('\033', '\\033')
|
|
||||||
safe_print(message)
|
safe_print(message)
|
||||||
|
|
||||||
backtrace_state = platformio_api.process_stacktrace(
|
backtrace_state = platformio_api.process_stacktrace(
|
||||||
|
@ -94,91 +127,65 @@ def run_miniterm(config, port, escape=False):
|
||||||
def write_cpp(config):
|
def write_cpp(config):
|
||||||
_LOGGER.info("Generating C++ source...")
|
_LOGGER.info("Generating C++ source...")
|
||||||
|
|
||||||
add_job(core_config.to_code, config[CONF_ESPHOMEYAML], domain='esphomeyaml')
|
CORE.add_job(core_config.to_code, config[CONF_ESPHOMEYAML], domain='esphomeyaml')
|
||||||
for domain in PRE_INITIALIZE:
|
for domain in PRE_INITIALIZE:
|
||||||
if domain == CONF_ESPHOMEYAML or domain not in config:
|
if domain == CONF_ESPHOMEYAML or domain not in config:
|
||||||
continue
|
continue
|
||||||
add_job(get_component(domain).to_code, config[domain], domain=domain)
|
CORE.add_job(get_component(domain).to_code, config[domain], domain=domain)
|
||||||
|
|
||||||
for domain, component, conf in iter_components(config):
|
for domain, component, conf in iter_components(config):
|
||||||
if domain in PRE_INITIALIZE or not hasattr(component, 'to_code'):
|
if domain in PRE_INITIALIZE or not hasattr(component, 'to_code'):
|
||||||
continue
|
continue
|
||||||
add_job(component.to_code, conf, domain=domain)
|
CORE.add_job(component.to_code, conf, domain=domain)
|
||||||
|
|
||||||
flush_tasks()
|
CORE.flush_tasks()
|
||||||
add(RawStatement(''))
|
add(RawStatement(''))
|
||||||
add(RawStatement(''))
|
add(RawStatement(''))
|
||||||
all_code = []
|
all_code = []
|
||||||
for exp in _EXPRESSIONS:
|
for exp in CORE.expressions:
|
||||||
if not config[CONF_ESPHOMEYAML][CONF_USE_CUSTOM_CODE]:
|
if not config[CONF_ESPHOMEYAML][CONF_USE_CUSTOM_CODE]:
|
||||||
if isinstance(exp, Expression) and not exp.required:
|
if isinstance(exp, Expression) and not exp.required:
|
||||||
continue
|
continue
|
||||||
if isinstance(exp, AssignmentExpression) and not exp.obj.required:
|
all_code.append(text_type(statement(exp)))
|
||||||
if not exp.has_side_effects():
|
|
||||||
continue
|
|
||||||
exp = exp.rhs
|
|
||||||
all_code.append(unicode(statement(exp)))
|
|
||||||
|
|
||||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
writer.write_platformio_project()
|
||||||
writer.write_platformio_project(config, build_path)
|
|
||||||
|
|
||||||
code_s = indent('\n'.join(line.rstrip() for line in all_code))
|
code_s = indent('\n'.join(line.rstrip() for line in all_code))
|
||||||
cpp_path = os.path.join(build_path, 'src', 'main.cpp')
|
writer.write_cpp(code_s)
|
||||||
writer.write_cpp(code_s, cpp_path)
|
|
||||||
return 0
|
return 0
|
||||||
|
|
||||||
|
|
||||||
def compile_program(args, config):
|
def compile_program(args, config):
|
||||||
_LOGGER.info("Compiling app...")
|
_LOGGER.info("Compiling app...")
|
||||||
return platformio_api.run_compile(config, args.verbose)
|
update_check = not os.getenv('ESPHOMEYAML_NO_UPDATE_CHECK', '')
|
||||||
|
if update_check:
|
||||||
|
thread = start_update_check_thread(esphomeyaml_storage_path(CORE.config_dir))
|
||||||
def get_upload_host(config):
|
rc = platformio_api.run_compile(config, args.verbose)
|
||||||
if CONF_MANUAL_IP in config[CONF_WIFI]:
|
if update_check:
|
||||||
host = str(config[CONF_WIFI][CONF_MANUAL_IP][CONF_STATIC_IP])
|
thread.join()
|
||||||
elif CONF_HOSTNAME in config[CONF_WIFI]:
|
return rc
|
||||||
host = config[CONF_WIFI][CONF_HOSTNAME] + config[CONF_WIFI][CONF_DOMAIN]
|
|
||||||
else:
|
|
||||||
host = config[CONF_ESPHOMEYAML][CONF_NAME] + config[CONF_WIFI][CONF_DOMAIN]
|
|
||||||
return host
|
|
||||||
|
|
||||||
|
|
||||||
def upload_using_esptool(config, port):
|
def upload_using_esptool(config, port):
|
||||||
import esptool
|
import esptool
|
||||||
|
|
||||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
path = os.path.join(CORE.build_path, '.pioenvs', CORE.name, 'firmware.bin')
|
||||||
path = os.path.join(build_path, '.pioenvs', core.NAME, 'firmware.bin')
|
|
||||||
cmd = ['esptool.py', '--before', 'default_reset', '--after', 'hard_reset',
|
cmd = ['esptool.py', '--before', 'default_reset', '--after', 'hard_reset',
|
||||||
'--chip', 'esp8266', '--port', port, 'write_flash', '0x0', path]
|
'--chip', 'esp8266', '--port', port, 'write_flash', '0x0', path]
|
||||||
# pylint: disable=protected-access
|
# pylint: disable=protected-access
|
||||||
return run_external_command(esptool._main, *cmd)
|
return run_external_command(esptool._main, *cmd)
|
||||||
|
|
||||||
|
|
||||||
def upload_program(config, args, port):
|
def upload_program(config, args, host):
|
||||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
|
||||||
|
|
||||||
# if upload is to a serial port use platformio, otherwise assume ota
|
# if upload is to a serial port use platformio, otherwise assume ota
|
||||||
serial_port = port.startswith('/') or port.startswith('COM')
|
if get_port_type(host) == 'SERIAL':
|
||||||
if port != 'OTA' and serial_port:
|
if CORE.is_esp8266:
|
||||||
if core.ESP_PLATFORM == ESP_PLATFORM_ESP8266 and args.use_esptoolpy:
|
return upload_using_esptool(config, host)
|
||||||
return upload_using_esptool(config, port)
|
return platformio_api.run_upload(config, args.verbose, host)
|
||||||
return platformio_api.run_upload(config, args.verbose, port)
|
|
||||||
|
|
||||||
if 'ota' not in config:
|
|
||||||
_LOGGER.error("No serial port found and OTA not enabled. Can't upload!")
|
|
||||||
return -1
|
|
||||||
|
|
||||||
# If hostname/ip is explicitly provided as upload-port argument, use this instead of zeroconf
|
|
||||||
# hostname. This is to support use cases where zeroconf (hostname.local) does not work.
|
|
||||||
if port != 'OTA':
|
|
||||||
host = port
|
|
||||||
else:
|
|
||||||
host = get_upload_host(config)
|
|
||||||
|
|
||||||
from esphomeyaml.components import ota
|
from esphomeyaml.components import ota
|
||||||
from esphomeyaml import espota2
|
from esphomeyaml import espota2
|
||||||
|
|
||||||
bin_file = os.path.join(build_path, '.pioenvs', core.NAME, 'firmware.bin')
|
|
||||||
if args.host_port is not None:
|
if args.host_port is not None:
|
||||||
host_port = args.host_port
|
host_port = args.host_port
|
||||||
else:
|
else:
|
||||||
|
@ -188,20 +195,31 @@ def upload_program(config, args, port):
|
||||||
remote_port = ota.get_port(config)
|
remote_port = ota.get_port(config)
|
||||||
password = ota.get_auth(config)
|
password = ota.get_auth(config)
|
||||||
|
|
||||||
res = espota2.run_ota(host, remote_port, password, bin_file)
|
storage = StorageJSON.load(storage_path())
|
||||||
|
res = espota2.run_ota(host, remote_port, password, CORE.firmware_bin)
|
||||||
if res == 0:
|
if res == 0:
|
||||||
|
if storage is not None and storage.use_legacy_ota:
|
||||||
|
storage.use_legacy_ota = False
|
||||||
|
storage.save(storage_path())
|
||||||
return res
|
return res
|
||||||
_LOGGER.warn("OTA v2 method failed. Trying with legacy OTA...")
|
if storage is not None and not storage.use_legacy_ota:
|
||||||
return espota2.run_legacy_ota(verbose, host_port, host, remote_port, password, bin_file)
|
return res
|
||||||
|
|
||||||
|
_LOGGER.warning("OTA v2 method failed. Trying with legacy OTA...")
|
||||||
|
return espota2.run_legacy_ota(verbose, host_port, host, remote_port, password,
|
||||||
|
CORE.firmware_bin)
|
||||||
|
|
||||||
|
|
||||||
def show_logs(config, args, port, escape=False):
|
def show_logs(config, args, port):
|
||||||
serial_port = port.startswith('/') or port.startswith('COM')
|
if get_port_type(port) == 'SERIAL':
|
||||||
if port != 'OTA' and serial_port:
|
run_miniterm(config, port)
|
||||||
run_miniterm(config, port, escape=escape)
|
|
||||||
return 0
|
return 0
|
||||||
return mqtt.show_logs(config, args.topic, args.username, args.password, args.client_id,
|
if get_port_type(port) == 'NETWORK':
|
||||||
escape=escape)
|
return run_logs(config, port)
|
||||||
|
if get_port_type(port) == 'MQTT':
|
||||||
|
return mqtt.show_logs(config, args.topic, args.username, args.password, args.client_id)
|
||||||
|
|
||||||
|
raise ValueError
|
||||||
|
|
||||||
|
|
||||||
def clean_mqtt(config, args):
|
def clean_mqtt(config, args):
|
||||||
|
@ -239,26 +257,8 @@ def command_wizard(args):
|
||||||
return wizard.wizard(args.configuration)
|
return wizard.wizard(args.configuration)
|
||||||
|
|
||||||
|
|
||||||
def strip_default_ids(config):
|
|
||||||
value = config
|
|
||||||
if isinstance(config, list):
|
|
||||||
value = type(config)()
|
|
||||||
for x in config:
|
|
||||||
if isinstance(x, core.ID) and not x.is_manual:
|
|
||||||
continue
|
|
||||||
value.append(strip_default_ids(x))
|
|
||||||
return value
|
|
||||||
elif isinstance(config, dict):
|
|
||||||
value = type(config)()
|
|
||||||
for k, v in config.iteritems():
|
|
||||||
if isinstance(v, core.ID) and not v.is_manual:
|
|
||||||
continue
|
|
||||||
value[k] = strip_default_ids(v)
|
|
||||||
return value
|
|
||||||
return value
|
|
||||||
|
|
||||||
|
|
||||||
def command_config(args, config):
|
def command_config(args, config):
|
||||||
|
_LOGGER.info("Configuration is valid!")
|
||||||
if not args.verbose:
|
if not args.verbose:
|
||||||
config = strip_default_ids(config)
|
config = strip_default_ids(config)
|
||||||
safe_print(yaml_util.dump(config))
|
safe_print(yaml_util.dump(config))
|
||||||
|
@ -280,7 +280,8 @@ def command_compile(args, config):
|
||||||
|
|
||||||
|
|
||||||
def command_upload(args, config):
|
def command_upload(args, config):
|
||||||
port = args.upload_port or choose_serial_port(config)
|
port = choose_upload_log_host(default=args.upload_port, check_default=None,
|
||||||
|
show_ota=True, show_mqtt=False, show_api=False)
|
||||||
exit_code = upload_program(config, args, port)
|
exit_code = upload_program(config, args, port)
|
||||||
if exit_code != 0:
|
if exit_code != 0:
|
||||||
return exit_code
|
return exit_code
|
||||||
|
@ -289,8 +290,9 @@ def command_upload(args, config):
|
||||||
|
|
||||||
|
|
||||||
def command_logs(args, config):
|
def command_logs(args, config):
|
||||||
port = args.serial_port or choose_serial_port(config)
|
port = choose_upload_log_host(default=args.serial_port, check_default=None,
|
||||||
return show_logs(config, args, port, escape=args.escape)
|
show_ota=False, show_mqtt=True, show_api=True)
|
||||||
|
return show_logs(config, args, port)
|
||||||
|
|
||||||
|
|
||||||
def command_run(args, config):
|
def command_run(args, config):
|
||||||
|
@ -301,14 +303,17 @@ def command_run(args, config):
|
||||||
if exit_code != 0:
|
if exit_code != 0:
|
||||||
return exit_code
|
return exit_code
|
||||||
_LOGGER.info(u"Successfully compiled program.")
|
_LOGGER.info(u"Successfully compiled program.")
|
||||||
port = args.upload_port or choose_serial_port(config)
|
port = choose_upload_log_host(default=args.upload_port, check_default=None,
|
||||||
|
show_ota=True, show_mqtt=False, show_api=True)
|
||||||
exit_code = upload_program(config, args, port)
|
exit_code = upload_program(config, args, port)
|
||||||
if exit_code != 0:
|
if exit_code != 0:
|
||||||
return exit_code
|
return exit_code
|
||||||
_LOGGER.info(u"Successfully uploaded program.")
|
_LOGGER.info(u"Successfully uploaded program.")
|
||||||
if args.no_logs:
|
if args.no_logs:
|
||||||
return 0
|
return 0
|
||||||
return show_logs(config, args, port, escape=args.escape)
|
port = choose_upload_log_host(default=args.upload_port, check_default=port,
|
||||||
|
show_ota=False, show_mqtt=True, show_api=True)
|
||||||
|
return show_logs(config, args, port)
|
||||||
|
|
||||||
|
|
||||||
def command_clean_mqtt(args, config):
|
def command_clean_mqtt(args, config):
|
||||||
|
@ -325,9 +330,8 @@ def command_version(args):
|
||||||
|
|
||||||
|
|
||||||
def command_clean(args, config):
|
def command_clean(args, config):
|
||||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
|
||||||
try:
|
try:
|
||||||
writer.clean_build(build_path)
|
writer.clean_build()
|
||||||
except OSError as err:
|
except OSError as err:
|
||||||
_LOGGER.error("Error deleting build files: %s", err)
|
_LOGGER.error("Error deleting build files: %s", err)
|
||||||
return 1
|
return 1
|
||||||
|
@ -386,6 +390,8 @@ def parse_args(argv):
|
||||||
parser = argparse.ArgumentParser(prog='esphomeyaml')
|
parser = argparse.ArgumentParser(prog='esphomeyaml')
|
||||||
parser.add_argument('-v', '--verbose', help="Enable verbose esphomeyaml logs.",
|
parser.add_argument('-v', '--verbose', help="Enable verbose esphomeyaml logs.",
|
||||||
action='store_true')
|
action='store_true')
|
||||||
|
parser.add_argument('--dashboard', help="Internal flag to set if the command is run from the "
|
||||||
|
"dashboard.", action='store_true')
|
||||||
parser.add_argument('configuration', help='Your YAML configuration file.')
|
parser.add_argument('configuration', help='Your YAML configuration file.')
|
||||||
|
|
||||||
subparsers = parser.add_subparsers(help='Commands', dest='command')
|
subparsers = parser.add_subparsers(help='Commands', dest='command')
|
||||||
|
@ -403,9 +409,6 @@ def parse_args(argv):
|
||||||
parser_upload.add_argument('--upload-port', help="Manually specify the upload port to use. "
|
parser_upload.add_argument('--upload-port', help="Manually specify the upload port to use. "
|
||||||
"For example /dev/cu.SLAB_USBtoUART.")
|
"For example /dev/cu.SLAB_USBtoUART.")
|
||||||
parser_upload.add_argument('--host-port', help="Specify the host port.", type=int)
|
parser_upload.add_argument('--host-port', help="Specify the host port.", type=int)
|
||||||
parser_upload.add_argument('--use-esptoolpy',
|
|
||||||
help="Use esptool.py for the uploading (only for ESP8266)",
|
|
||||||
action='store_true')
|
|
||||||
|
|
||||||
parser_logs = subparsers.add_parser('logs', help='Validate the configuration '
|
parser_logs = subparsers.add_parser('logs', help='Validate the configuration '
|
||||||
'and show all MQTT logs.')
|
'and show all MQTT logs.')
|
||||||
|
@ -415,8 +418,6 @@ def parse_args(argv):
|
||||||
parser_logs.add_argument('--client-id', help='Manually set the client id.')
|
parser_logs.add_argument('--client-id', help='Manually set the client id.')
|
||||||
parser_logs.add_argument('--serial-port', help="Manually specify a serial port to use"
|
parser_logs.add_argument('--serial-port', help="Manually specify a serial port to use"
|
||||||
"For example /dev/cu.SLAB_USBtoUART.")
|
"For example /dev/cu.SLAB_USBtoUART.")
|
||||||
parser_logs.add_argument('--escape', help="Escape ANSI color codes for running in dashboard",
|
|
||||||
action='store_true')
|
|
||||||
|
|
||||||
parser_run = subparsers.add_parser('run', help='Validate the configuration, create a binary, '
|
parser_run = subparsers.add_parser('run', help='Validate the configuration, create a binary, '
|
||||||
'upload it, and start MQTT logs.')
|
'upload it, and start MQTT logs.')
|
||||||
|
@ -429,11 +430,6 @@ def parse_args(argv):
|
||||||
parser_run.add_argument('--username', help='Manually set the MQTT username for logs.')
|
parser_run.add_argument('--username', help='Manually set the MQTT username for logs.')
|
||||||
parser_run.add_argument('--password', help='Manually set the MQTT password for logs.')
|
parser_run.add_argument('--password', help='Manually set the MQTT password for logs.')
|
||||||
parser_run.add_argument('--client-id', help='Manually set the client id for logs.')
|
parser_run.add_argument('--client-id', help='Manually set the client id for logs.')
|
||||||
parser_run.add_argument('--escape', help="Escape ANSI color codes for running in dashboard",
|
|
||||||
action='store_true')
|
|
||||||
parser_run.add_argument('--use-esptoolpy',
|
|
||||||
help="Use esptool.py for the uploading (only for ESP8266)",
|
|
||||||
action='store_true')
|
|
||||||
|
|
||||||
parser_clean = subparsers.add_parser('clean-mqtt', help="Helper to clear an MQTT topic from "
|
parser_clean = subparsers.add_parser('clean-mqtt', help="Helper to clear an MQTT topic from "
|
||||||
"retain messages.")
|
"retain messages.")
|
||||||
|
@ -453,14 +449,21 @@ def parse_args(argv):
|
||||||
|
|
||||||
dashboard = subparsers.add_parser('dashboard',
|
dashboard = subparsers.add_parser('dashboard',
|
||||||
help="Create a simple web server for a dashboard.")
|
help="Create a simple web server for a dashboard.")
|
||||||
dashboard.add_argument("--port", help="The HTTP port to open connections on.", type=int,
|
dashboard.add_argument("--port", help="The HTTP port to open connections on. Defaults to 6052.",
|
||||||
default=6052)
|
type=int, default=6052)
|
||||||
dashboard.add_argument("--password", help="The optional password to require for all requests.",
|
dashboard.add_argument("--password", help="The optional password to require for all requests.",
|
||||||
type=str, default='')
|
type=str, default='')
|
||||||
dashboard.add_argument("--open-ui", help="Open the dashboard UI in a browser.",
|
dashboard.add_argument("--open-ui", help="Open the dashboard UI in a browser.",
|
||||||
action='store_true')
|
action='store_true')
|
||||||
|
dashboard.add_argument("--hassio",
|
||||||
|
help="Internal flag used to tell esphomeyaml is started as a Hass.io "
|
||||||
|
"add-on.",
|
||||||
|
action="store_true")
|
||||||
|
dashboard.add_argument("--socket",
|
||||||
|
help="Make the dashboard serve under a unix socket", type=str)
|
||||||
|
|
||||||
subparsers.add_parser('hass-config', help="Dump the configuration entries that should be added"
|
subparsers.add_parser('hass-config',
|
||||||
|
help="Dump the configuration entries that should be added "
|
||||||
"to Home Assistant when not using MQTT discovery.")
|
"to Home Assistant when not using MQTT discovery.")
|
||||||
|
|
||||||
return parser.parse_args(argv[1:])
|
return parser.parse_args(argv[1:])
|
||||||
|
@ -468,24 +471,27 @@ def parse_args(argv):
|
||||||
|
|
||||||
def run_esphomeyaml(argv):
|
def run_esphomeyaml(argv):
|
||||||
args = parse_args(argv)
|
args = parse_args(argv)
|
||||||
|
CORE.dashboard = args.dashboard
|
||||||
|
|
||||||
setup_log(args.verbose)
|
setup_log(args.verbose)
|
||||||
if args.command in PRE_CONFIG_ACTIONS:
|
if args.command in PRE_CONFIG_ACTIONS:
|
||||||
try:
|
try:
|
||||||
return PRE_CONFIG_ACTIONS[args.command](args)
|
return PRE_CONFIG_ACTIONS[args.command](args)
|
||||||
except ESPHomeYAMLError as e:
|
except EsphomeyamlError as e:
|
||||||
_LOGGER.error(e)
|
_LOGGER.error(e)
|
||||||
return 1
|
return 1
|
||||||
|
|
||||||
core.CONFIG_PATH = args.configuration
|
CORE.config_path = args.configuration
|
||||||
|
|
||||||
config = read_config(core.CONFIG_PATH)
|
config = read_config(args.verbose)
|
||||||
if config is None:
|
if config is None:
|
||||||
return 1
|
return 1
|
||||||
|
CORE.config = config
|
||||||
|
|
||||||
if args.command in POST_CONFIG_ACTIONS:
|
if args.command in POST_CONFIG_ACTIONS:
|
||||||
try:
|
try:
|
||||||
return POST_CONFIG_ACTIONS[args.command](args, config)
|
return POST_CONFIG_ACTIONS[args.command](args, config)
|
||||||
except ESPHomeYAMLError as e:
|
except EsphomeyamlError as e:
|
||||||
_LOGGER.error(e)
|
_LOGGER.error(e)
|
||||||
return 1
|
return 1
|
||||||
safe_print(u"Unknown command {}".format(args.command))
|
safe_print(u"Unknown command {}".format(args.command))
|
||||||
|
@ -495,7 +501,7 @@ def run_esphomeyaml(argv):
|
||||||
def main():
|
def main():
|
||||||
try:
|
try:
|
||||||
return run_esphomeyaml(sys.argv)
|
return run_esphomeyaml(sys.argv)
|
||||||
except ESPHomeYAMLError as e:
|
except EsphomeyamlError as e:
|
||||||
_LOGGER.error(e)
|
_LOGGER.error(e)
|
||||||
return 1
|
return 1
|
||||||
except KeyboardInterrupt:
|
except KeyboardInterrupt:
|
||||||
|
|
0
esphomeyaml/api/__init__.py
Normal file
330
esphomeyaml/api/api.proto
Normal file
@@ -0,0 +1,330 @@
syntax = "proto3";

// The Home Assistant protocol is structured as a simple
// TCP socket with short binary messages encoded in the protocol buffers format.
// First, a message in this protocol has a specific format:
//  * VarInt denoting the size of the message object. (type is not part of this)
//  * VarInt denoting the type of message.
//  * The message object encoded as a ProtoBuf message
//
// The connection is established in 4 steps:
//  * First, the client connects to the server and sends a "Hello Request" identifying itself
//  * The server responds with a "Hello Response" and selects the protocol version
//  * After receiving this message, the client attempts to authenticate itself using
//    the password and a "Connect Request"
//  * The server responds with a "Connect Response" and notifies of an invalid password.
// If anything in this initial process fails, the connection must immediately be closed
// by both sides and _no_ disconnection message is to be sent.
||||||
|
// Message sent at the beginning of each connection
|
||||||
|
// Can only be sent by the client and only at the beginning of the connection
|
||||||
|
message HelloRequest {
|
||||||
|
// Description of client (like User Agent)
|
||||||
|
// For example "Home Assistant"
|
||||||
|
// Not strictly necessary to send but nice for debugging
|
||||||
|
// purposes.
|
||||||
|
string client_info = 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Confirmation of successful connection request.
|
||||||
|
// Can only be sent by the server and only at the beginning of the connection
|
||||||
|
message HelloResponse {
|
||||||
|
// The version of the API to use. The _client_ (for example Home Assistant) needs to check
|
||||||
|
// for compatibility and if necessary adopt to an older API.
|
||||||
|
// Major is for breaking changes in the base protocol - a mismatch will lead to immediate disconnect_client_
|
||||||
|
// Minor is for breaking changes in individual messages - a mismatch will lead to a warning message
|
||||||
|
uint32 api_version_major = 1;
|
||||||
|
uint32 api_version_minor = 2;
|
||||||
|
|
||||||
|
// A string identifying the server (ESP); like client info this may be empty
|
||||||
|
// and only exists for debugging/logging purposes.
|
||||||
|
// For example "ESPHome v1.10.0 on ESP8266"
|
||||||
|
string server_info = 3;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Message sent at the beginning of each connection to authenticate the client
|
||||||
|
// Can only be sent by the client and only at the beginning of the connection
|
||||||
|
message ConnectRequest {
|
||||||
|
// The password to log in with
|
||||||
|
string password = 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Confirmation of successful connection. After this the connection is available for all traffic.
|
||||||
|
// Can only be sent by the server and only at the beginning of the connection
|
||||||
|
message ConnectResponse {
|
||||||
|
bool invalid_password = 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Request to close the connection.
|
||||||
|
// Can be sent by both the client and server
|
||||||
|
message DisconnectRequest {
|
||||||
|
// Do not close the connection before the acknowledgement arrives
|
||||||
|
}
|
||||||
|
|
||||||
|
message DisconnectResponse {
|
||||||
|
// Empty - Both parties are required to close the connection after this
|
||||||
|
// message has been received.
|
||||||
|
}
|
||||||
|
|
||||||
|
message PingRequest {
|
||||||
|
// Empty
|
||||||
|
}
|
||||||
|
|
||||||
|
message PingResponse {
|
||||||
|
// Empty
|
||||||
|
}
|
||||||
|
|
||||||
|
message DeviceInfoRequest {
|
||||||
|
// Empty
|
||||||
|
}
|
||||||
|
|
||||||
|
message DeviceInfoResponse {
|
||||||
|
bool uses_password = 1;
|
||||||
|
|
||||||
|
// The name of the node, given by "App.set_name()"
|
||||||
|
string name = 2;
|
||||||
|
|
||||||
|
// The mac address of the device. For example "AC:BC:32:89:0E:A9"
|
||||||
|
string mac_address = 3;
|
||||||
|
|
||||||
|
// A string describing the ESPHome version. For example "1.10.0"
|
||||||
|
string esphome_core_version = 4;
|
||||||
|
|
||||||
|
// A string describing the date of compilation, this is generated by the compiler
|
||||||
|
// and therefore may not be in the same format all the time.
|
||||||
|
// If the user isn't using esphomeyaml, this will also not be set.
|
||||||
|
string compilation_time = 5;
|
||||||
|
|
||||||
|
// The model of the board. For example NodeMCU
|
||||||
|
string model = 6;
|
||||||
|
|
||||||
|
bool has_deep_sleep = 7;
|
||||||
|
}
|
||||||
|
|
||||||
|
message ListEntitiesRequest {
|
||||||
|
// Empty
|
||||||
|
}
|
||||||
|
|
||||||
|
message ListEntitiesBinarySensorResponse {
|
||||||
|
string object_id = 1;
|
||||||
|
fixed32 key = 2;
|
||||||
|
string name = 3;
|
||||||
|
string unique_id = 4;
|
||||||
|
|
||||||
|
string device_class = 5;
|
||||||
|
bool is_status_binary_sensor = 6;
|
||||||
|
}
|
||||||
|
message ListEntitiesCoverResponse {
|
||||||
|
string object_id = 1;
|
||||||
|
fixed32 key = 2;
|
||||||
|
string name = 3;
|
||||||
|
string unique_id = 4;
|
||||||
|
|
||||||
|
bool is_optimistic = 5;
|
||||||
|
}
|
||||||
|
message ListEntitiesFanResponse {
|
||||||
|
string object_id = 1;
|
||||||
|
fixed32 key = 2;
|
||||||
|
string name = 3;
|
||||||
|
string unique_id = 4;
|
||||||
|
|
||||||
|
bool supports_oscillation = 5;
|
||||||
|
bool supports_speed = 6;
|
||||||
|
}
|
||||||
|
message ListEntitiesLightResponse {
|
||||||
|
string object_id = 1;
|
||||||
|
fixed32 key = 2;
|
||||||
|
string name = 3;
|
||||||
|
string unique_id = 4;
|
||||||
|
|
||||||
|
bool supports_brightness = 5;
|
||||||
|
bool supports_rgb = 6;
|
||||||
|
bool supports_white_value = 7;
|
||||||
|
bool supports_color_temperature = 8;
|
||||||
|
float min_mireds = 9;
|
||||||
|
float max_mireds = 10;
|
||||||
|
repeated string effects = 11;
|
||||||
|
}
|
||||||
|
message ListEntitiesSensorResponse {
|
||||||
|
string object_id = 1;
|
||||||
|
fixed32 key = 2;
|
||||||
|
string name = 3;
|
||||||
|
string unique_id = 4;
|
||||||
|
|
||||||
|
string icon = 5;
|
||||||
|
string unit_of_measurement = 6;
|
||||||
|
int32 accuracy_decimals = 7;
|
||||||
|
}
|
||||||
|
message ListEntitiesSwitchResponse {
|
||||||
|
string object_id = 1;
|
||||||
|
fixed32 key = 2;
|
||||||
|
string name = 3;
|
||||||
|
string unique_id = 4;
|
||||||
|
|
||||||
|
string icon = 5;
|
||||||
|
bool optimistic = 6;
|
||||||
|
}
|
||||||
|
message ListEntitiesTextSensorResponse {
|
||||||
|
string object_id = 1;
|
||||||
|
fixed32 key = 2;
|
||||||
|
string name = 3;
|
||||||
|
string unique_id = 4;
|
||||||
|
|
||||||
|
string icon = 5;
|
||||||
|
}
|
||||||
|
message ListEntitiesDoneResponse {
|
||||||
|
// Empty
|
||||||
|
}
|
||||||
|
|
||||||
|
message SubscribeStatesRequest {
|
||||||
|
// Empty
|
||||||
|
}
|
||||||
|
message BinarySensorStateResponse {
|
||||||
|
fixed32 key = 1;
|
||||||
|
bool state = 2;
|
||||||
|
}
|
||||||
|
message CoverStateResponse {
|
||||||
|
fixed32 key = 1;
|
||||||
|
enum CoverState {
|
||||||
|
OPEN = 0;
|
||||||
|
CLOSED = 1;
|
||||||
|
}
|
||||||
|
CoverState state = 2;
|
||||||
|
}
|
||||||
|
enum FanSpeed {
|
||||||
|
LOW = 0;
|
||||||
|
MEDIUM = 1;
|
||||||
|
HIGH = 2;
|
||||||
|
}
|
||||||
|
message FanStateResponse {
|
||||||
|
fixed32 key = 1;
|
||||||
|
bool state = 2;
|
||||||
|
bool oscillating = 3;
|
||||||
|
FanSpeed speed = 4;
|
||||||
|
}
|
||||||
|
message LightStateResponse {
|
||||||
|
fixed32 key = 1;
|
||||||
|
bool state = 2;
|
||||||
|
float brightness = 3;
|
||||||
|
float red = 4;
|
||||||
|
float green = 5;
|
||||||
|
float blue = 6;
|
||||||
|
float white = 7;
|
||||||
|
float color_temperature = 8;
|
||||||
|
string effect = 9;
|
||||||
|
}
|
||||||
|
message SensorStateResponse {
|
||||||
|
fixed32 key = 1;
|
||||||
|
float state = 2;
|
||||||
|
}
|
||||||
|
message SwitchStateResponse {
|
||||||
|
fixed32 key = 1;
|
||||||
|
bool state = 2;
|
||||||
|
}
|
||||||
|
message TextSensorStateResponse {
|
||||||
|
fixed32 key = 1;
|
||||||
|
string state = 2;
|
||||||
|
}
|
||||||
|
|
||||||
|
message CoverCommandRequest {
|
||||||
|
fixed32 key = 1;
|
||||||
|
enum CoverCommand {
|
||||||
|
OPEN = 0;
|
||||||
|
CLOSE = 1;
|
||||||
|
STOP = 2;
|
||||||
|
}
|
||||||
|
bool has_state = 2;
|
||||||
|
CoverCommand command = 3;
|
||||||
|
}
|
||||||
|
message FanCommandRequest {
|
||||||
|
fixed32 key = 1;
|
||||||
|
bool has_state = 2;
|
||||||
|
bool state = 3;
|
||||||
|
bool has_speed = 4;
|
||||||
|
FanSpeed speed = 5;
|
||||||
|
bool has_oscillating = 6;
|
||||||
|
bool oscillating = 7;
|
||||||
|
}
|
||||||
|
message LightCommandRequest {
|
||||||
|
fixed32 key = 1;
|
||||||
|
bool has_state = 2;
|
||||||
|
bool state = 3;
|
||||||
|
bool has_brightness = 4;
|
||||||
|
float brightness = 5;
|
||||||
|
bool has_rgb = 6;
|
||||||
|
float red = 7;
|
||||||
|
float green = 8;
|
||||||
|
float blue = 9;
|
||||||
|
bool has_white = 10;
|
||||||
|
float white = 11;
|
||||||
|
bool has_color_temperature = 12;
|
||||||
|
float color_temperature = 13;
|
||||||
|
bool has_transition_length = 14;
|
||||||
|
uint32 transition_length = 15;
|
||||||
|
bool has_flash_length = 16;
|
||||||
|
uint32 flash_length = 17;
|
||||||
|
bool has_effect = 18;
|
||||||
|
string effect = 19;
|
||||||
|
}
|
||||||
|
message SwitchCommandRequest {
|
||||||
|
fixed32 key = 1;
|
||||||
|
bool state = 2;
|
||||||
|
}
|
||||||
|
|
||||||
|
enum LogLevel {
|
||||||
|
NONE = 0;
|
||||||
|
ERROR = 1;
|
||||||
|
WARN = 2;
|
||||||
|
INFO = 3;
|
||||||
|
DEBUG = 4;
|
||||||
|
VERBOSE = 5;
|
||||||
|
VERY_VERBOSE = 6;
|
||||||
|
}
|
||||||
|
|
||||||
|
message SubscribeLogsRequest {
|
||||||
|
LogLevel level = 1;
|
||||||
|
bool dump_config = 2;
|
||||||
|
}
|
||||||
|
|
||||||
|
message SubscribeLogsResponse {
|
||||||
|
LogLevel level = 1;
|
||||||
|
string tag = 2;
|
||||||
|
string message = 3;
|
||||||
|
bool send_failed = 4;
|
||||||
|
}
|
||||||
|
|
||||||
|
message SubscribeServiceCallsRequest {
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
message ServiceCallResponse {
|
||||||
|
string service = 1;
|
||||||
|
map<string, string> data = 2;
|
||||||
|
map<string, string> data_template = 3;
|
||||||
|
map<string, string> variables = 4;
|
||||||
|
}
|
||||||
|
|
||||||
|
// 1. Client sends SubscribeHomeAssistantStatesRequest
|
||||||
|
// 2. Server responds with zero or more SubscribeHomeAssistantStateResponse (async)
|
||||||
|
// 3. Client sends HomeAssistantStateResponse for state changes.
|
||||||
|
message SubscribeHomeAssistantStatesRequest {
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
message SubscribeHomeAssistantStateResponse {
|
||||||
|
string entity_id = 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
message HomeAssistantStateResponse {
|
||||||
|
string entity_id = 1;
|
||||||
|
string state = 2;
|
||||||
|
}
|
||||||
|
|
||||||
|
message GetTimeRequest {
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
message GetTimeResponse {
|
||||||
|
fixed32 epoch_seconds = 1;
|
||||||
|
}
|
||||||
|
|
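The comments at the top of the new api.proto describe the wire framing: a leading 0x00 preamble byte, the payload length as a varuint, the message type as a varuint, then the serialized protobuf message. The sketch below is illustrative only; `encode_varuint` and `encode_frame` are hypothetical helper names, and it is assumed (based on `_varuint_to_bytes` and `_send_message` in the new `esphomeyaml/api/client.py` further down) that this is how a frame is assembled on the wire.

```python
# Minimal sketch of the frame layout described in api.proto (assumed to mirror
# client.py's _send_message): 0x00 preamble, varuint payload length, varuint
# message type, then the serialized protobuf message bytes.
def encode_varuint(value):
    # Protobuf-style base-128 varint encoding of an unsigned integer.
    out = bytearray()
    while True:
        temp = value & 0x7F
        value >>= 7
        if value:
            out.append(temp | 0x80)
        else:
            out.append(temp)
            return bytes(out)


def encode_frame(message_type, payload):
    # payload must already be a serialized protobuf message (bytes).
    return b'\x00' + encode_varuint(len(payload)) + encode_varuint(message_type) + payload


# Example: message type 1 is HelloRequest in client.py's MESSAGE_TYPE_TO_PROTO;
# the payload bytes here are illustrative placeholders.
print(encode_frame(1, b'\x0a\x0besphomeyaml').hex())
```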
2484
esphomeyaml/api/api_pb2.py
Normal file
490
esphomeyaml/api/client.py
Normal file
|
@ -0,0 +1,490 @@
|
||||||
|
from datetime import datetime
|
||||||
|
import functools
|
||||||
|
import logging
|
||||||
|
import socket
|
||||||
|
import threading
|
||||||
|
import time
|
||||||
|
|
||||||
|
# pylint: disable=unused-import
|
||||||
|
from typing import Optional # noqa
|
||||||
|
from google.protobuf import message # noqa
|
||||||
|
|
||||||
|
from esphomeyaml import const
|
||||||
|
import esphomeyaml.api.api_pb2 as pb
|
||||||
|
from esphomeyaml.const import CONF_PASSWORD, CONF_PORT
|
||||||
|
from esphomeyaml.core import EsphomeyamlError
|
||||||
|
from esphomeyaml.helpers import resolve_ip_address, indent, color
|
||||||
|
from esphomeyaml.py_compat import text_type, IS_PY2, byte_to_bytes, char_to_byte, format_bytes
|
||||||
|
from esphomeyaml.util import safe_print
|
||||||
|
|
||||||
|
_LOGGER = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class APIConnectionError(EsphomeyamlError):
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
MESSAGE_TYPE_TO_PROTO = {
|
||||||
|
1: pb.HelloRequest,
|
||||||
|
2: pb.HelloResponse,
|
||||||
|
3: pb.ConnectRequest,
|
||||||
|
4: pb.ConnectResponse,
|
||||||
|
5: pb.DisconnectRequest,
|
||||||
|
6: pb.DisconnectResponse,
|
||||||
|
7: pb.PingRequest,
|
||||||
|
8: pb.PingResponse,
|
||||||
|
9: pb.DeviceInfoRequest,
|
||||||
|
10: pb.DeviceInfoResponse,
|
||||||
|
11: pb.ListEntitiesRequest,
|
||||||
|
12: pb.ListEntitiesBinarySensorResponse,
|
||||||
|
13: pb.ListEntitiesCoverResponse,
|
||||||
|
14: pb.ListEntitiesFanResponse,
|
||||||
|
15: pb.ListEntitiesLightResponse,
|
||||||
|
16: pb.ListEntitiesSensorResponse,
|
||||||
|
17: pb.ListEntitiesSwitchResponse,
|
||||||
|
18: pb.ListEntitiesTextSensorResponse,
|
||||||
|
19: pb.ListEntitiesDoneResponse,
|
||||||
|
20: pb.SubscribeStatesRequest,
|
||||||
|
21: pb.BinarySensorStateResponse,
|
||||||
|
22: pb.CoverStateResponse,
|
||||||
|
23: pb.FanStateResponse,
|
||||||
|
24: pb.LightStateResponse,
|
||||||
|
25: pb.SensorStateResponse,
|
||||||
|
26: pb.SwitchStateResponse,
|
||||||
|
27: pb.TextSensorStateResponse,
|
||||||
|
28: pb.SubscribeLogsRequest,
|
||||||
|
29: pb.SubscribeLogsResponse,
|
||||||
|
30: pb.CoverCommandRequest,
|
||||||
|
31: pb.FanCommandRequest,
|
||||||
|
32: pb.LightCommandRequest,
|
||||||
|
33: pb.SwitchCommandRequest,
|
||||||
|
34: pb.SubscribeServiceCallsRequest,
|
||||||
|
35: pb.ServiceCallResponse,
|
||||||
|
36: pb.GetTimeRequest,
|
||||||
|
37: pb.GetTimeResponse,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def _varuint_to_bytes(value):
|
||||||
|
if value <= 0x7F:
|
||||||
|
return byte_to_bytes(value)
|
||||||
|
|
||||||
|
ret = bytes()
|
||||||
|
while value:
|
||||||
|
temp = value & 0x7F
|
||||||
|
value >>= 7
|
||||||
|
if value:
|
||||||
|
ret += byte_to_bytes(temp | 0x80)
|
||||||
|
else:
|
||||||
|
ret += byte_to_bytes(temp)
|
||||||
|
|
||||||
|
return ret
|
||||||
|
|
||||||
|
|
||||||
|
def _bytes_to_varuint(value):
|
||||||
|
result = 0
|
||||||
|
bitpos = 0
|
||||||
|
for c in value:
|
||||||
|
val = char_to_byte(c)
|
||||||
|
result |= (val & 0x7F) << bitpos
|
||||||
|
bitpos += 7
|
||||||
|
if (val & 0x80) == 0:
|
||||||
|
return result
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
# pylint: disable=too-many-instance-attributes,not-callable
|
||||||
|
class APIClient(threading.Thread):
|
||||||
|
def __init__(self, address, port, password):
|
||||||
|
threading.Thread.__init__(self)
|
||||||
|
self._address = address # type: str
|
||||||
|
self._port = port # type: int
|
||||||
|
self._password = password # type: Optional[str]
|
||||||
|
self._socket = None # type: Optional[socket.socket]
|
||||||
|
self._socket_open_event = threading.Event()
|
||||||
|
self._socket_write_lock = threading.Lock()
|
||||||
|
self._connected = False
|
||||||
|
self._authenticated = False
|
||||||
|
self._message_handlers = []
|
||||||
|
self._keepalive = 5
|
||||||
|
self._ping_timer = None
|
||||||
|
self._refresh_ping()
|
||||||
|
|
||||||
|
self.on_disconnect = None
|
||||||
|
self.on_connect = None
|
||||||
|
self.on_login = None
|
||||||
|
self.auto_reconnect = False
|
||||||
|
self._running_event = threading.Event()
|
||||||
|
self._stop_event = threading.Event()
|
||||||
|
|
||||||
|
@property
|
||||||
|
def stopped(self):
|
||||||
|
return self._stop_event.is_set()
|
||||||
|
|
||||||
|
def _refresh_ping(self):
|
||||||
|
if self._ping_timer is not None:
|
||||||
|
self._ping_timer.cancel()
|
||||||
|
self._ping_timer = None
|
||||||
|
|
||||||
|
def func():
|
||||||
|
self._ping_timer = None
|
||||||
|
|
||||||
|
if self._connected:
|
||||||
|
try:
|
||||||
|
self.ping()
|
||||||
|
except APIConnectionError:
|
||||||
|
self._fatal_error()
|
||||||
|
else:
|
||||||
|
self._refresh_ping()
|
||||||
|
|
||||||
|
self._ping_timer = threading.Timer(self._keepalive, func)
|
||||||
|
self._ping_timer.start()
|
||||||
|
|
||||||
|
def _cancel_ping(self):
|
||||||
|
if self._ping_timer is not None:
|
||||||
|
self._ping_timer.cancel()
|
||||||
|
self._ping_timer = None
|
||||||
|
|
||||||
|
def _close_socket(self):
|
||||||
|
self._cancel_ping()
|
||||||
|
if self._socket is not None:
|
||||||
|
self._socket.close()
|
||||||
|
self._socket = None
|
||||||
|
self._socket_open_event.clear()
|
||||||
|
self._connected = False
|
||||||
|
self._authenticated = False
|
||||||
|
self._message_handlers = []
|
||||||
|
|
||||||
|
def stop(self, force=False):
|
||||||
|
if self.stopped:
|
||||||
|
raise ValueError
|
||||||
|
|
||||||
|
if self._connected and not force:
|
||||||
|
try:
|
||||||
|
self.disconnect()
|
||||||
|
except APIConnectionError:
|
||||||
|
pass
|
||||||
|
self._close_socket()
|
||||||
|
|
||||||
|
self._stop_event.set()
|
||||||
|
if not force:
|
||||||
|
self.join()
|
||||||
|
|
||||||
|
def connect(self):
|
||||||
|
if not self._running_event.wait(0.1):
|
||||||
|
raise APIConnectionError("You need to call start() first!")
|
||||||
|
|
||||||
|
if self._connected:
|
||||||
|
raise APIConnectionError("Already connected!")
|
||||||
|
|
||||||
|
try:
|
||||||
|
ip = resolve_ip_address(self._address)
|
||||||
|
except EsphomeyamlError as err:
|
||||||
|
_LOGGER.warning("Error resolving IP address of %s. Is it connected to WiFi?",
|
||||||
|
self._address)
|
||||||
|
_LOGGER.warning("(If this error persists, please set a static IP address: "
|
||||||
|
"https://esphomelib.com/esphomeyaml/components/wifi.html#manual-ips)")
|
||||||
|
raise APIConnectionError(err)
|
||||||
|
|
||||||
|
_LOGGER.info("Connecting to %s:%s (%s)", self._address, self._port, ip)
|
||||||
|
self._socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||||
|
self._socket.settimeout(10.0)
|
||||||
|
self._socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
|
||||||
|
try:
|
||||||
|
self._socket.connect((ip, self._port))
|
||||||
|
except socket.error as err:
|
||||||
|
self._fatal_error()
|
||||||
|
raise APIConnectionError("Error connecting to {}: {}".format(ip, err))
|
||||||
|
self._socket.settimeout(0.1)
|
||||||
|
|
||||||
|
self._socket_open_event.set()
|
||||||
|
|
||||||
|
hello = pb.HelloRequest()
|
||||||
|
hello.client_info = 'esphomeyaml v{}'.format(const.__version__)
|
||||||
|
try:
|
||||||
|
resp = self._send_message_await_response(hello, pb.HelloResponse)
|
||||||
|
except APIConnectionError as err:
|
||||||
|
self._fatal_error()
|
||||||
|
raise err
|
||||||
|
_LOGGER.debug("Successfully connected to %s ('%s' API=%s.%s)", self._address,
|
||||||
|
resp.server_info, resp.api_version_major, resp.api_version_minor)
|
||||||
|
self._connected = True
|
||||||
|
if self.on_connect is not None:
|
||||||
|
self.on_connect()
|
||||||
|
|
||||||
|
def _check_connected(self):
|
||||||
|
if not self._connected:
|
||||||
|
self._fatal_error()
|
||||||
|
raise APIConnectionError("Must be connected!")
|
||||||
|
|
||||||
|
def login(self):
|
||||||
|
self._check_connected()
|
||||||
|
if self._authenticated:
|
||||||
|
raise APIConnectionError("Already logged in!")
|
||||||
|
|
||||||
|
connect = pb.ConnectRequest()
|
||||||
|
if self._password is not None:
|
||||||
|
connect.password = self._password
|
||||||
|
resp = self._send_message_await_response(connect, pb.ConnectResponse)
|
||||||
|
if resp.invalid_password:
|
||||||
|
raise APIConnectionError("Invalid password!")
|
||||||
|
|
||||||
|
self._authenticated = True
|
||||||
|
if self.on_login is not None:
|
||||||
|
self.on_login()
|
||||||
|
|
||||||
|
def _fatal_error(self):
|
||||||
|
was_connected = self._connected
|
||||||
|
|
||||||
|
self._close_socket()
|
||||||
|
|
||||||
|
if was_connected and self.on_disconnect is not None:
|
||||||
|
self.on_disconnect()
|
||||||
|
|
||||||
|
def _write(self, data): # type: (bytes) -> None
|
||||||
|
if self._socket is None:
|
||||||
|
raise APIConnectionError("Socket closed")
|
||||||
|
|
||||||
|
_LOGGER.debug("Write: %s", format_bytes(data))
|
||||||
|
with self._socket_write_lock:
|
||||||
|
try:
|
||||||
|
self._socket.sendall(data)
|
||||||
|
except socket.error as err:
|
||||||
|
self._fatal_error()
|
||||||
|
raise APIConnectionError("Error while writing data: {}".format(err))
|
||||||
|
|
||||||
|
def _send_message(self, msg):
|
||||||
|
# type: (message.Message) -> None
|
||||||
|
for message_type, klass in MESSAGE_TYPE_TO_PROTO.items():
|
||||||
|
if isinstance(msg, klass):
|
||||||
|
break
|
||||||
|
else:
|
||||||
|
raise ValueError
|
||||||
|
|
||||||
|
encoded = msg.SerializeToString()
|
||||||
|
_LOGGER.debug("Sending %s:\n%s", type(msg), indent(text_type(msg)))
|
||||||
|
if IS_PY2:
|
||||||
|
req = chr(0x00)
|
||||||
|
else:
|
||||||
|
req = bytes([0])
|
||||||
|
req += _varuint_to_bytes(len(encoded))
|
||||||
|
req += _varuint_to_bytes(message_type)
|
||||||
|
req += encoded
|
||||||
|
self._write(req)
|
||||||
|
self._refresh_ping()
|
||||||
|
|
||||||
|
def _send_message_await_response_complex(self, send_msg, do_append, do_stop, timeout=1):
|
||||||
|
event = threading.Event()
|
||||||
|
responses = []
|
||||||
|
|
||||||
|
def on_message(resp):
|
||||||
|
if do_append(resp):
|
||||||
|
responses.append(resp)
|
||||||
|
if do_stop(resp):
|
||||||
|
event.set()
|
||||||
|
|
||||||
|
self._message_handlers.append(on_message)
|
||||||
|
self._send_message(send_msg)
|
||||||
|
ret = event.wait(timeout)
|
||||||
|
try:
|
||||||
|
self._message_handlers.remove(on_message)
|
||||||
|
except ValueError:
|
||||||
|
pass
|
||||||
|
if not ret:
|
||||||
|
raise APIConnectionError("Timeout while waiting for message response!")
|
||||||
|
return responses
|
||||||
|
|
||||||
|
def _send_message_await_response(self, send_msg, response_type, timeout=1):
|
||||||
|
def is_response(msg):
|
||||||
|
return isinstance(msg, response_type)
|
||||||
|
|
||||||
|
return self._send_message_await_response_complex(send_msg, is_response, is_response,
|
||||||
|
timeout)[0]
|
||||||
|
|
||||||
|
def device_info(self):
|
||||||
|
self._check_connected()
|
||||||
|
return self._send_message_await_response(pb.DeviceInfoRequest(), pb.DeviceInfoResponse)
|
||||||
|
|
||||||
|
def ping(self):
|
||||||
|
self._check_connected()
|
||||||
|
return self._send_message_await_response(pb.PingRequest(), pb.PingResponse)
|
||||||
|
|
||||||
|
def disconnect(self):
|
||||||
|
self._check_connected()
|
||||||
|
|
||||||
|
try:
|
||||||
|
self._send_message_await_response(pb.DisconnectRequest(), pb.DisconnectResponse)
|
||||||
|
except APIConnectionError:
|
||||||
|
pass
|
||||||
|
self._close_socket()
|
||||||
|
|
||||||
|
if self.on_disconnect is not None:
|
||||||
|
self.on_disconnect()
|
||||||
|
|
||||||
|
def _check_authenticated(self):
|
||||||
|
if not self._authenticated:
|
||||||
|
raise APIConnectionError("Must login first!")
|
||||||
|
|
||||||
|
def subscribe_logs(self, on_log, log_level=None, dump_config=False):
|
||||||
|
self._check_authenticated()
|
||||||
|
|
||||||
|
def on_msg(msg):
|
||||||
|
if isinstance(msg, pb.SubscribeLogsResponse):
|
||||||
|
on_log(msg)
|
||||||
|
|
||||||
|
self._message_handlers.append(on_msg)
|
||||||
|
req = pb.SubscribeLogsRequest(dump_config=dump_config)
|
||||||
|
if log_level is not None:
|
||||||
|
req.level = log_level
|
||||||
|
self._send_message(req)
|
||||||
|
|
||||||
|
def _recv(self, amount):
|
||||||
|
ret = bytes()
|
||||||
|
if amount == 0:
|
||||||
|
return ret
|
||||||
|
|
||||||
|
while len(ret) < amount:
|
||||||
|
if self.stopped:
|
||||||
|
raise APIConnectionError("Stopped!")
|
||||||
|
if not self._socket_open_event.is_set():
|
||||||
|
raise APIConnectionError("No socket!")
|
||||||
|
try:
|
||||||
|
val = self._socket.recv(amount - len(ret))
|
||||||
|
except AttributeError:
|
||||||
|
raise APIConnectionError("Socket was closed")
|
||||||
|
except socket.timeout:
|
||||||
|
continue
|
||||||
|
except socket.error as err:
|
||||||
|
raise APIConnectionError("Error while receiving data: {}".format(err))
|
||||||
|
ret += val
|
||||||
|
return ret
|
||||||
|
|
||||||
|
def _recv_varint(self):
|
||||||
|
raw = bytes()
|
||||||
|
while not raw or char_to_byte(raw[-1]) & 0x80:
|
||||||
|
raw += self._recv(1)
|
||||||
|
return _bytes_to_varuint(raw)
|
||||||
|
|
||||||
|
def _run_once(self):
|
||||||
|
if not self._socket_open_event.wait(0.1):
|
||||||
|
return
|
||||||
|
|
||||||
|
# Preamble
|
||||||
|
if char_to_byte(self._recv(1)[0]) != 0x00:
|
||||||
|
raise APIConnectionError("Invalid preamble")
|
||||||
|
|
||||||
|
length = self._recv_varint()
|
||||||
|
msg_type = self._recv_varint()
|
||||||
|
|
||||||
|
raw_msg = self._recv(length)
|
||||||
|
if msg_type not in MESSAGE_TYPE_TO_PROTO:
|
||||||
|
_LOGGER.debug("Skipping message type %s", msg_type)
|
||||||
|
return
|
||||||
|
|
||||||
|
msg = MESSAGE_TYPE_TO_PROTO[msg_type]()
|
||||||
|
msg.ParseFromString(raw_msg)
|
||||||
|
_LOGGER.debug("Got message: %s:\n%s", type(msg), indent(str(msg)))
|
||||||
|
for msg_handler in self._message_handlers[:]:
|
||||||
|
msg_handler(msg)
|
||||||
|
self._handle_internal_messages(msg)
|
||||||
|
self._refresh_ping()
|
||||||
|
|
||||||
|
def run(self):
|
||||||
|
self._running_event.set()
|
||||||
|
while not self.stopped:
|
||||||
|
try:
|
||||||
|
self._run_once()
|
||||||
|
except APIConnectionError as err:
|
||||||
|
if self.stopped:
|
||||||
|
break
|
||||||
|
if self._connected:
|
||||||
|
_LOGGER.error("Error while reading incoming messages: %s", err)
|
||||||
|
self._fatal_error()
|
||||||
|
self._running_event.clear()
|
||||||
|
|
||||||
|
def _handle_internal_messages(self, msg):
|
||||||
|
if isinstance(msg, pb.DisconnectRequest):
|
||||||
|
self._send_message(pb.DisconnectResponse())
|
||||||
|
if self._socket is not None:
|
||||||
|
self._socket.close()
|
||||||
|
self._socket = None
|
||||||
|
self._connected = False
|
||||||
|
if self.on_disconnect is not None:
|
||||||
|
self.on_disconnect()
|
||||||
|
elif isinstance(msg, pb.PingRequest):
|
||||||
|
self._send_message(pb.PingResponse())
|
||||||
|
elif isinstance(msg, pb.GetTimeRequest):
|
||||||
|
resp = pb.GetTimeResponse()
|
||||||
|
resp.epoch_seconds = int(time.time())
|
||||||
|
self._send_message(resp)
|
||||||
|
|
||||||
|
|
||||||
|
def run_logs(config, address):
|
||||||
|
conf = config['api']
|
||||||
|
port = conf[CONF_PORT]
|
||||||
|
password = conf[CONF_PASSWORD]
|
||||||
|
_LOGGER.info("Starting log output from %s using esphomelib API", address)
|
||||||
|
|
||||||
|
cli = APIClient(address, port, password)
|
||||||
|
stopping = False
|
||||||
|
retry_timer = []
|
||||||
|
|
||||||
|
def try_connect(tries=0, is_disconnect=True):
|
||||||
|
if stopping:
|
||||||
|
return
|
||||||
|
|
||||||
|
if is_disconnect:
|
||||||
|
_LOGGER.warning(u"Disconnected from API.")
|
||||||
|
|
||||||
|
while retry_timer:
|
||||||
|
retry_timer.pop(0).cancel()
|
||||||
|
|
||||||
|
error = None
|
||||||
|
try:
|
||||||
|
cli.connect()
|
||||||
|
cli.login()
|
||||||
|
except APIConnectionError as err: # noqa
|
||||||
|
error = err
|
||||||
|
|
||||||
|
if error is None:
|
||||||
|
_LOGGER.info("Successfully connected to %s", address)
|
||||||
|
return
|
||||||
|
|
||||||
|
wait_time = min(2**tries, 300)
|
||||||
|
_LOGGER.warning(u"Couldn't connect to API (%s). Trying to reconnect in %s seconds",
|
||||||
|
error, wait_time)
|
||||||
|
timer = threading.Timer(wait_time, functools.partial(try_connect, tries + 1, is_disconnect))
|
||||||
|
timer.start()
|
||||||
|
retry_timer.append(timer)
|
||||||
|
|
||||||
|
def on_log(msg):
|
||||||
|
time_ = datetime.now().time().strftime(u'[%H:%M:%S]')
|
||||||
|
text = msg.message
|
||||||
|
if msg.send_failed:
|
||||||
|
text = color('white', '(Message skipped because it was too big to fit in '
|
||||||
|
'TCP buffer - This is only cosmetic)')
|
||||||
|
safe_print(time_ + text)
|
||||||
|
|
||||||
|
has_connects = []
|
||||||
|
|
||||||
|
def on_login():
|
||||||
|
try:
|
||||||
|
cli.subscribe_logs(on_log, dump_config=not has_connects)
|
||||||
|
has_connects.append(True)
|
||||||
|
except APIConnectionError:
|
||||||
|
cli.disconnect()
|
||||||
|
|
||||||
|
cli.on_disconnect = try_connect
|
||||||
|
cli.on_login = on_login
|
||||||
|
cli.start()
|
||||||
|
|
||||||
|
try:
|
||||||
|
try_connect(is_disconnect=False)
|
||||||
|
while True:
|
||||||
|
time.sleep(1)
|
||||||
|
except KeyboardInterrupt:
|
||||||
|
stopping = True
|
||||||
|
cli.stop(True)
|
||||||
|
while retry_timer:
|
||||||
|
retry_timer.pop(0).cancel()
|
||||||
|
return 0
|
|
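For reference, a hypothetical direct use of the `APIClient` added above, following the same call order `run_logs()` uses: start the background reader thread, connect, authenticate, then subscribe to log messages. The address, port, and password are placeholders, not values taken from this change.

```python
import time

from esphomeyaml.api.client import APIClient, APIConnectionError


def print_log(msg):
    # msg is a SubscribeLogsResponse protobuf message.
    print(msg.message)


# Placeholder address/port/password - substitute the values from your own 'api:' config.
cli = APIClient('livingroom.local', 6053, 'MyPassword')
cli.start()  # starts the background thread that reads incoming messages
try:
    cli.connect()                  # TCP connect + Hello handshake
    cli.login()                    # ConnectRequest with the password
    cli.subscribe_logs(print_log)  # stream SubscribeLogsResponse messages
    time.sleep(30)                 # let log output arrive for a while
except APIConnectionError as err:
    print("API error:", err)
finally:
    cli.stop(force=True)
```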
@ -2,16 +2,15 @@ import copy
|
||||||
|
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
from esphomeyaml import core
|
|
||||||
import esphomeyaml.config_validation as cv
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_ABOVE, CONF_ACTION_ID, CONF_AND, CONF_AUTOMATION_ID, \
|
from esphomeyaml.const import CONF_ABOVE, CONF_ACTION_ID, CONF_AND, CONF_AUTOMATION_ID, \
|
||||||
CONF_BELOW, CONF_CONDITION, CONF_CONDITION_ID, CONF_DELAY, \
|
CONF_BELOW, CONF_CONDITION, CONF_CONDITION_ID, CONF_DELAY, CONF_ELSE, CONF_ID, CONF_IF, \
|
||||||
CONF_ELSE, CONF_ID, CONF_IF, CONF_LAMBDA, \
|
CONF_LAMBDA, CONF_OR, CONF_RANGE, CONF_THEN, CONF_TRIGGER_ID, CONF_WHILE
|
||||||
CONF_OR, CONF_RANGE, CONF_THEN, CONF_TRIGGER_ID
|
from esphomeyaml.core import CORE
|
||||||
from esphomeyaml.core import ESPHomeYAMLError
|
from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, TemplateArguments, add, \
|
||||||
from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, TemplateArguments, add, add_job, \
|
get_variable, process_lambda, templatable
|
||||||
-    esphomelib_ns, float_, process_lambda, templatable, uint32, get_variable, PollingComponent, \
-    Action, Component, Trigger
+from esphomeyaml.cpp_types import Action, App, Component, PollingComponent, Trigger, \
+    esphomelib_ns, float_, uint32, void, bool_
 from esphomeyaml.util import ServiceRegistry

@@ -27,41 +26,82 @@ def maybe_simple_id(*validators):


 def validate_recursive_condition(value):
-    return CONDITIONS_SCHEMA(value)
+    is_list = isinstance(value, list)
+    value = cv.ensure_list()(value)[:]
+    for i, item in enumerate(value):
+        path = [i] if is_list else []
+        item = copy.deepcopy(item)
+        if not isinstance(item, dict):
+            raise vol.Invalid(u"Condition must consist of key-value mapping! Got {}".format(item),
+                              path)
+        key = next((x for x in item if x != CONF_CONDITION_ID), None)
+        if key is None:
+            raise vol.Invalid(u"Key missing from action! Got {}".format(item), path)
+        if key not in CONDITION_REGISTRY:
+            raise vol.Invalid(u"Unable to find condition with the name '{}', is the "
+                              u"component loaded?".format(key), path + [key])
+        item.setdefault(CONF_CONDITION_ID, None)
+        key2 = next((x for x in item if x not in (CONF_CONDITION_ID, key)), None)
+        if key2 is not None:
+            raise vol.Invalid(u"Cannot have two conditions in one item. Key '{}' overrides '{}'! "
+                              u"Did you forget to indent the block inside the condition?"
+                              u"".format(key, key2), path)
+        validator = CONDITION_REGISTRY[key][0]
+        try:
+            condition = validator(item[key])
+        except vol.Invalid as err:
+            err.prepend(path)
+            raise err
+        value[i] = {
+            CONF_CONDITION_ID: cv.declare_variable_id(Condition)(item[CONF_CONDITION_ID]),
+            key: condition,
+        }
+    return value


 def validate_recursive_action(value):
-    value = cv.ensure_list(value)[:]
+    is_list = isinstance(value, list)
+    if not is_list:
+        value = [value]
     for i, item in enumerate(value):
+        path = [i] if is_list else []
         item = copy.deepcopy(item)
         if not isinstance(item, dict):
-            raise vol.Invalid(u"Action must consist of key-value mapping! Got {}".format(item))
+            raise vol.Invalid(u"Action must consist of key-value mapping! Got {}".format(item),
+                              path)
         key = next((x for x in item if x != CONF_ACTION_ID), None)
         if key is None:
-            raise vol.Invalid(u"Key missing from action! Got {}".format(item))
+            raise vol.Invalid(u"Key missing from action! Got {}".format(item), path)
         if key not in ACTION_REGISTRY:
             raise vol.Invalid(u"Unable to find action with the name '{}', is the component loaded?"
-                              u"".format(key))
+                              u"".format(key), path + [key])
         item.setdefault(CONF_ACTION_ID, None)
-        key2 = next((x for x in item if x != CONF_ACTION_ID and x != key), None)
+        key2 = next((x for x in item if x not in (CONF_ACTION_ID, key)), None)
         if key2 is not None:
             raise vol.Invalid(u"Cannot have two actions in one item. Key '{}' overrides '{}'! "
-                              u"Did you forget to indent the action?"
-                              u"".format(key, key2))
+                              u"Did you forget to indent the block inside the action?"
+                              u"".format(key, key2), path)
         validator = ACTION_REGISTRY[key][0]
+        try:
+            action = validator(item[key])
+        except vol.Invalid as err:
+            err.prepend(path)
+            raise err
         value[i] = {
             CONF_ACTION_ID: cv.declare_variable_id(Action)(item[CONF_ACTION_ID]),
-            key: validator(item[key])
+            key: action,
         }
     return value
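For orientation, a minimal standalone sketch of the shape these two validators normalize to (plain dicts only; the string value of CONF_ACTION_ID is assumed here purely for illustration):

```python
# Illustrative only -- not esphomeyaml code. validate_recursive_action always
# returns a list, fills the CONF_ACTION_ID slot, and replaces each action's
# value with whatever the registered per-action schema returned.
CONF_ACTION_ID = 'action_id'  # assumed key name, for the example only

raw = {'delay': '2s'}                      # a single action, not yet wrapped in a list
normalized = [{CONF_ACTION_ID: None,       # set via setdefault, later a declared variable ID
               'delay': '2s'}]             # replaced by DELAY_ACTION_SCHEMA's result
print(normalized)
```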
 
 
 ACTION_REGISTRY = ServiceRegistry()
+CONDITION_REGISTRY = ServiceRegistry()
 
 # pylint: disable=invalid-name
 DelayAction = esphomelib_ns.class_('DelayAction', Action, Component)
 LambdaAction = esphomelib_ns.class_('LambdaAction', Action)
 IfAction = esphomelib_ns.class_('IfAction', Action)
+WhileAction = esphomelib_ns.class_('WhileAction', Action)
 UpdateComponentAction = esphomelib_ns.class_('UpdateComponentAction', Action)
 Automation = esphomelib_ns.class_('Automation')

@@ -71,17 +111,6 @@ OrCondition = esphomelib_ns.class_('OrCondition', Condition)
 RangeCondition = esphomelib_ns.class_('RangeCondition', Condition)
 LambdaCondition = esphomelib_ns.class_('LambdaCondition', Condition)
 
-CONDITIONS_SCHEMA = vol.All(cv.ensure_list, [cv.templatable({
-    cv.GenerateID(CONF_CONDITION_ID): cv.declare_variable_id(Condition),
-    vol.Optional(CONF_AND): validate_recursive_condition,
-    vol.Optional(CONF_OR): validate_recursive_condition,
-    vol.Optional(CONF_RANGE): vol.All(vol.Schema({
-        vol.Optional(CONF_ABOVE): vol.Coerce(float),
-        vol.Optional(CONF_BELOW): vol.Coerce(float),
-    }), cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW)),
-    vol.Optional(CONF_LAMBDA): cv.lambda_,
-})])
-
 
 def validate_automation(extra_schema=None, extra_validators=None, single=False):
     schema = AUTOMATION_SCHEMA.extend(extra_schema or {})

@@ -122,63 +151,63 @@ def validate_automation(extra_schema=None, extra_validators=None, single=False):
 AUTOMATION_SCHEMA = vol.Schema({
     cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(Trigger),
     cv.GenerateID(CONF_AUTOMATION_ID): cv.declare_variable_id(Automation),
-    vol.Optional(CONF_IF): CONDITIONS_SCHEMA,
+    vol.Optional(CONF_IF): validate_recursive_condition,
     vol.Required(CONF_THEN): validate_recursive_action,
 })
 
+AND_CONDITION_SCHEMA = validate_recursive_condition
 
-def build_condition(config, arg_type):
-    template_arg = TemplateArguments(arg_type)
-    if isinstance(config, core.Lambda):
-        lambda_ = None
-        for lambda_ in process_lambda(config, [(arg_type, 'x')]):
-            yield
-        yield LambdaCondition.new(template_arg, lambda_)
-    elif CONF_AND in config:
-        yield AndCondition.new(template_arg, build_conditions(config[CONF_AND], template_arg))
-    elif CONF_OR in config:
-        yield OrCondition.new(template_arg, build_conditions(config[CONF_OR], template_arg))
-    elif CONF_LAMBDA in config:
-        lambda_ = None
-        for lambda_ in process_lambda(config[CONF_LAMBDA], [(arg_type, 'x')]):
-            yield
-        yield LambdaCondition.new(template_arg, lambda_)
-    elif CONF_RANGE in config:
-        conf = config[CONF_RANGE]
-        rhs = RangeCondition.new(template_arg)
-        type = RangeCondition.template(template_arg)
-        condition = Pvariable(config[CONF_CONDITION_ID], rhs, type=type)
-        if CONF_ABOVE in conf:
-            template_ = None
-            for template_ in templatable(conf[CONF_ABOVE], arg_type, float_):
-                yield
-            condition.set_min(template_)
-        if CONF_BELOW in conf:
-            template_ = None
-            for template_ in templatable(conf[CONF_BELOW], arg_type, float_):
-                yield
-            condition.set_max(template_)
-        yield condition
-    else:
-        raise ESPHomeYAMLError(u"Unsupported condition {}".format(config))
-
-
-def build_conditions(config, arg_type):
-    conditions = []
-    for conf in config:
-        condition = None
-        for condition in build_condition(conf, arg_type):
-            yield None
-        conditions.append(condition)
-    yield ArrayInitializer(*conditions)
+
+@CONDITION_REGISTRY.register(CONF_AND, AND_CONDITION_SCHEMA)
+def and_condition_to_code(config, condition_id, arg_type, template_arg):
+    for conditions in build_conditions(config, arg_type):
+        yield
+    rhs = AndCondition.new(template_arg, conditions)
+    type = AndCondition.template(template_arg)
+    yield Pvariable(condition_id, rhs, type=type)
+
+
+OR_CONDITION_SCHEMA = validate_recursive_condition
+
+
+@CONDITION_REGISTRY.register(CONF_OR, OR_CONDITION_SCHEMA)
+def or_condition_to_code(config, condition_id, arg_type, template_arg):
+    for conditions in build_conditions(config, arg_type):
+        yield
+    rhs = OrCondition.new(template_arg, conditions)
+    type = OrCondition.template(template_arg)
+    yield Pvariable(condition_id, rhs, type=type)
+
+
+RANGE_CONDITION_SCHEMA = vol.All(vol.Schema({
+    vol.Optional(CONF_ABOVE): cv.templatable(cv.float_),
+    vol.Optional(CONF_BELOW): cv.templatable(cv.float_),
+}), cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW))
+
+
+@CONDITION_REGISTRY.register(CONF_RANGE, RANGE_CONDITION_SCHEMA)
+def range_condition_to_code(config, condition_id, arg_type, template_arg):
+    for conditions in build_conditions(config, arg_type):
+        yield
+    rhs = RangeCondition.new(template_arg, conditions)
+    type = RangeCondition.template(template_arg)
+    condition = Pvariable(condition_id, rhs, type=type)
+    if CONF_ABOVE in config:
+        for template_ in templatable(config[CONF_ABOVE], arg_type, float_):
+            yield
+        condition.set_min(template_)
+    if CONF_BELOW in config:
+        for template_ in templatable(config[CONF_BELOW], arg_type, float_):
+            yield
+        condition.set_max(template_)
+    yield condition
 
 
 DELAY_ACTION_SCHEMA = cv.templatable(cv.positive_time_period_milliseconds)
 
 
 @ACTION_REGISTRY.register(CONF_DELAY, DELAY_ACTION_SCHEMA)
-def delay_action_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def delay_action_to_code(config, action_id, arg_type, template_arg):
     rhs = App.register_component(DelayAction.new(template_arg))
     type = DelayAction.template(template_arg)
     action = Pvariable(action_id, rhs, type=type)

@@ -196,8 +225,7 @@ IF_ACTION_SCHEMA = vol.All({
 
 
 @ACTION_REGISTRY.register(CONF_IF, IF_ACTION_SCHEMA)
-def if_action_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def if_action_to_code(config, action_id, arg_type, template_arg):
     for conditions in build_conditions(config[CONF_CONDITION], arg_type):
         yield None
     rhs = IfAction.new(template_arg, conditions)

@@ -214,19 +242,49 @@ def if_action_to_code(config, action_id, arg_type):
     yield action
 
 
+WHILE_ACTION_SCHEMA = vol.Schema({
+    vol.Required(CONF_CONDITION): validate_recursive_condition,
+    vol.Required(CONF_THEN): validate_recursive_action,
+})
+
+
+@ACTION_REGISTRY.register(CONF_WHILE, WHILE_ACTION_SCHEMA)
+def while_action_to_code(config, action_id, arg_type, template_arg):
+    for conditions in build_conditions(config[CONF_CONDITION], arg_type):
+        yield None
+    rhs = WhileAction.new(template_arg, conditions)
+    type = WhileAction.template(template_arg)
+    action = Pvariable(action_id, rhs, type=type)
+    for actions in build_actions(config[CONF_THEN], arg_type):
+        yield None
+    add(action.add_then(actions))
+    yield action
+
+
 LAMBDA_ACTION_SCHEMA = cv.lambda_
 
 
 @ACTION_REGISTRY.register(CONF_LAMBDA, LAMBDA_ACTION_SCHEMA)
-def lambda_action_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
-    for lambda_ in process_lambda(config, [(arg_type, 'x')]):
+def lambda_action_to_code(config, action_id, arg_type, template_arg):
+    for lambda_ in process_lambda(config, [(arg_type, 'x')], return_type=void):
         yield None
     rhs = LambdaAction.new(template_arg, lambda_)
     type = LambdaAction.template(template_arg)
     yield Pvariable(action_id, rhs, type=type)
 
 
+LAMBDA_CONDITION_SCHEMA = cv.lambda_
+
+
+@CONDITION_REGISTRY.register(CONF_LAMBDA, LAMBDA_CONDITION_SCHEMA)
+def lambda_condition_to_code(config, condition_id, arg_type, template_arg):
+    for lambda_ in process_lambda(config, [(arg_type, 'x')], return_type=bool_):
+        yield
+    rhs = LambdaCondition.new(template_arg, lambda_)
+    type = LambdaCondition.template(template_arg)
+    yield Pvariable(condition_id, rhs, type=type)
+
+
 CONF_COMPONENT_UPDATE = 'component.update'
 COMPONENT_UPDATE_ACTION_SCHEMA = maybe_simple_id({
     vol.Required(CONF_ID): cv.use_variable_id(PollingComponent),

@@ -234,8 +292,7 @@ COMPONENT_UPDATE_ACTION_SCHEMA = maybe_simple_id({
 
 
 @ACTION_REGISTRY.register(CONF_COMPONENT_UPDATE, COMPONENT_UPDATE_ACTION_SCHEMA)
-def component_update_action_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def component_update_action_to_code(config, action_id, arg_type, template_arg):
     for var in get_variable(config[CONF_ID]):
         yield None
     rhs = UpdateComponentAction.new(var)

@@ -248,7 +305,8 @@ def build_action(full_config, arg_type):
     key, config = next((k, v) for k, v in full_config.items() if k in ACTION_REGISTRY)
 
     builder = ACTION_REGISTRY[key][1]
-    for result in builder(config, action_id, arg_type):
+    template_arg = TemplateArguments(arg_type)
+    for result in builder(config, action_id, arg_type, template_arg):
         yield None
     yield result

@@ -263,6 +321,26 @@ def build_actions(config, arg_type):
     yield ArrayInitializer(*actions, multiline=False)
 
 
+def build_condition(full_config, arg_type):
+    action_id = full_config[CONF_CONDITION_ID]
+    key, config = next((k, v) for k, v in full_config.items() if k in CONDITION_REGISTRY)
+
+    builder = CONDITION_REGISTRY[key][1]
+    template_arg = TemplateArguments(arg_type)
+    for result in builder(config, action_id, arg_type, template_arg):
+        yield None
+    yield result
+
+
+def build_conditions(config, arg_type):
+    conditions = []
+    for conf in config:
+        for condition in build_condition(conf, arg_type):
+            yield None
+        conditions.append(condition)
+    yield ArrayInitializer(*conditions, multiline=False)
+
+
 def build_automation_(trigger, arg_type, config):
     rhs = App.make_automation(TemplateArguments(arg_type), trigger)
     type = Automation.template(arg_type)

@@ -280,4 +358,4 @@ def build_automation_(trigger, arg_type, config):
 
 
 def build_automation(trigger, arg_type, config):
-    add_job(build_automation_, trigger, arg_type, config)
+    CORE.add_job(build_automation_, trigger, arg_type, config)
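The *_to_code functions above are generators that the esphomeyaml core polls until their dependencies exist; a bare `yield None` means "not ready yet, ask me again". A self-contained sketch of that idiom (hypothetical driver, not the real CORE.add_job machinery):

```python
# Minimal sketch of the generator-as-coroutine idiom used by the registries above.
VARIABLES = {}


def get_variable(var_id):
    # Yields None while the variable does not exist yet, then yields it.
    while var_id not in VARIABLES:
        yield None
    yield VARIABLES[var_id]


def make_action(var_id):
    var = None
    for var in get_variable(var_id):
        yield None
    yield 'UpdateComponentAction({})'.format(var)


def run_jobs(jobs):
    # Tiny round-robin driver: None means "poll this job again later".
    results, queue = [], list(jobs)
    while queue:                      # note: spins forever if a job never resolves
        job = queue.pop(0)
        value = next(job)
        if value is None:
            queue.append(job)
        else:
            results.append(value)
    return results


VARIABLES['my_component'] = 'Pvariable<my_component>'
print(run_jobs([make_action('my_component')]))
```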
@@ -1,27 +1,27 @@
 import voluptuous as vol
 
-from esphomeyaml.components import sensor, i2c
+from esphomeyaml.components import i2c, sensor
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_ADDRESS, CONF_ID
-from esphomeyaml.helpers import App, Pvariable, setup_component, Component
+from esphomeyaml.cpp_generator import Pvariable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Component
 
 DEPENDENCIES = ['i2c']
+MULTI_CONF = True
 
 ADS1115Component = sensor.sensor_ns.class_('ADS1115Component', Component, i2c.I2CDevice)
 
-ADS1115_SCHEMA = vol.Schema({
+CONFIG_SCHEMA = vol.Schema({
     cv.GenerateID(): cv.declare_variable_id(ADS1115Component),
     vol.Required(CONF_ADDRESS): cv.i2c_address,
 }).extend(cv.COMPONENT_SCHEMA.schema)
 
-CONFIG_SCHEMA = vol.All(cv.ensure_list, [ADS1115_SCHEMA])
-
 
 def to_code(config):
-    for conf in config:
-        rhs = App.make_ads1115_component(conf[CONF_ADDRESS])
-        var = Pvariable(conf[CONF_ID], rhs)
-        setup_component(var, conf)
+    rhs = App.make_ads1115_component(config[CONF_ADDRESS])
+    var = Pvariable(config[CONF_ID], rhs)
+    setup_component(var, config)
 
 
 BUILD_FLAGS = '-DUSE_ADS1115_SENSOR'
esphomeyaml/components/apds9960.py (new file, 33 lines)
@@ -0,0 +1,33 @@
import voluptuous as vol

from esphomeyaml.components import i2c, sensor
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ADDRESS, CONF_ID, CONF_UPDATE_INTERVAL
from esphomeyaml.cpp_generator import Pvariable, add
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import App, PollingComponent

DEPENDENCIES = ['i2c']
MULTI_CONF = True

CONF_APDS9960_ID = 'apds9960_id'
APDS9960 = sensor.sensor_ns.class_('APDS9960', PollingComponent, i2c.I2CDevice)

CONFIG_SCHEMA = vol.Schema({
    cv.GenerateID(): cv.declare_variable_id(APDS9960),
    vol.Optional(CONF_ADDRESS): cv.i2c_address,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.make_apds9960(config.get(CONF_UPDATE_INTERVAL))
    var = Pvariable(config[CONF_ID], rhs)

    if CONF_ADDRESS in config:
        add(var.set_address(config[CONF_ADDRESS]))

    setup_component(var, config)


BUILD_FLAGS = '-DUSE_APDS9960'
esphomeyaml/components/api.py (new file, 88 lines)
@@ -0,0 +1,88 @@
import voluptuous as vol

from esphomeyaml.automation import ACTION_REGISTRY
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_DATA, CONF_DATA_TEMPLATE, CONF_ID, CONF_PASSWORD, CONF_PORT, \
    CONF_SERVICE, CONF_VARIABLES, CONF_REBOOT_TIMEOUT
from esphomeyaml.core import CORE
from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, add, get_variable, process_lambda
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import Action, App, Component, StoringController, esphomelib_ns

api_ns = esphomelib_ns.namespace('api')
APIServer = api_ns.class_('APIServer', Component, StoringController)
HomeAssistantServiceCallAction = api_ns.class_('HomeAssistantServiceCallAction', Action)
KeyValuePair = api_ns.class_('KeyValuePair')
TemplatableKeyValuePair = api_ns.class_('TemplatableKeyValuePair')

CONFIG_SCHEMA = vol.Schema({
    cv.GenerateID(): cv.declare_variable_id(APIServer),
    vol.Optional(CONF_PORT, default=6053): cv.port,
    vol.Optional(CONF_PASSWORD, default=''): cv.string_strict,
    vol.Optional(CONF_REBOOT_TIMEOUT): cv.positive_time_period_milliseconds,
}).extend(cv.COMPONENT_SCHEMA.schema)


def to_code(config):
    rhs = App.init_api_server()
    api = Pvariable(config[CONF_ID], rhs)

    if config[CONF_PORT] != 6053:
        add(api.set_port(config[CONF_PORT]))
    if config.get(CONF_PASSWORD):
        add(api.set_password(config[CONF_PASSWORD]))
    if CONF_REBOOT_TIMEOUT in config:
        add(api.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))

    setup_component(api, config)


BUILD_FLAGS = '-DUSE_API'


def lib_deps(config):
    if CORE.is_esp32:
        return 'AsyncTCP@1.0.1'
    if CORE.is_esp8266:
        return 'ESPAsyncTCP@1.1.3'
    raise NotImplementedError


CONF_HOMEASSISTANT_SERVICE = 'homeassistant.service'
HOMEASSISTANT_SERVIC_ACTION_SCHEMA = vol.Schema({
    cv.GenerateID(): cv.use_variable_id(APIServer),
    vol.Required(CONF_SERVICE): cv.string,
    vol.Optional(CONF_DATA): vol.Schema({
        cv.string: cv.string,
    }),
    vol.Optional(CONF_DATA_TEMPLATE): vol.Schema({
        cv.string: cv.string,
    }),
    vol.Optional(CONF_VARIABLES): vol.Schema({
        cv.string: cv.lambda_,
    }),
})


@ACTION_REGISTRY.register(CONF_HOMEASSISTANT_SERVICE, HOMEASSISTANT_SERVIC_ACTION_SCHEMA)
def homeassistant_service_to_code(config, action_id, arg_type, template_arg):
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_home_assistant_service_call_action(template_arg)
    type = HomeAssistantServiceCallAction.template(arg_type)
    act = Pvariable(action_id, rhs, type=type)
    add(act.set_service(config[CONF_SERVICE]))
    if CONF_DATA in config:
        datas = [KeyValuePair(k, v) for k, v in config[CONF_DATA].items()]
        add(act.set_data(ArrayInitializer(*datas)))
    if CONF_DATA_TEMPLATE in config:
        datas = [KeyValuePair(k, v) for k, v in config[CONF_DATA_TEMPLATE].items()]
        add(act.set_data_template(ArrayInitializer(*datas)))
    if CONF_VARIABLES in config:
        datas = []
        for key, value in config[CONF_VARIABLES].items():
            for value_ in process_lambda(value, []):
                yield None
            datas.append(TemplatableKeyValuePair(key, value_))
        add(act.set_variables(ArrayInitializer(*datas)))
    yield act
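A standalone voluptuous sketch (no esphomeyaml imports; key names and ranges are illustrative, not the component's real validators) of the defaulting and validation behaviour the api CONFIG_SCHEMA above relies on:

```python
import voluptuous as vol

# Hypothetical stand-ins for cv.port / cv.string_strict, for illustration only.
schema = vol.Schema({
    vol.Optional('port', default=6053): vol.All(int, vol.Range(min=1, max=65535)),
    vol.Optional('password', default=''): str,
})

print(schema({}))                  # defaults filled: {'port': 6053, 'password': ''}
print(schema({'port': 6054}))      # explicit value kept, password defaulted
try:
    schema({'port': 'not-a-port'})
except vol.Invalid as err:
    print(err)                     # vol.Invalid naming data['port']
```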
@@ -1,16 +1,21 @@
 import voluptuous as vol
 
 from esphomeyaml import automation, core
+from esphomeyaml.automation import maybe_simple_id, CONDITION_REGISTRY, Condition
 from esphomeyaml.components import mqtt
+from esphomeyaml.components.mqtt import setup_mqtt_component
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_DELAYED_OFF, CONF_DELAYED_ON, CONF_DEVICE_CLASS, CONF_FILTERS, \
     CONF_HEARTBEAT, CONF_ID, CONF_INTERNAL, CONF_INVALID_COOLDOWN, CONF_INVERT, CONF_INVERTED, \
     CONF_LAMBDA, CONF_MAX_LENGTH, CONF_MIN_LENGTH, CONF_MQTT_ID, CONF_ON_CLICK, \
     CONF_ON_DOUBLE_CLICK, CONF_ON_MULTI_CLICK, CONF_ON_PRESS, CONF_ON_RELEASE, CONF_STATE, \
-    CONF_TIMING, CONF_TRIGGER_ID
-from esphomeyaml.helpers import App, ArrayInitializer, NoArg, Pvariable, StructInitializer, add, \
-    add_job, bool_, esphomelib_ns, process_lambda, setup_mqtt_component, Nameable, Trigger, \
-    Component
+    CONF_TIMING, CONF_TRIGGER_ID, CONF_ON_STATE
+from esphomeyaml.core import CORE
+from esphomeyaml.cpp_generator import process_lambda, ArrayInitializer, add, Pvariable, \
+    StructInitializer, get_variable
+from esphomeyaml.cpp_types import esphomelib_ns, Nameable, Trigger, NoArg, Component, App, bool_, \
+    optional
+from esphomeyaml.py_compat import string_types
 
 DEVICE_CLASSES = [
     '', 'battery', 'cold', 'connectivity', 'door', 'garage_door', 'gas',

@@ -25,6 +30,7 @@ PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
 
 binary_sensor_ns = esphomelib_ns.namespace('binary_sensor')
 BinarySensor = binary_sensor_ns.class_('BinarySensor', Nameable)
+BinarySensorPtr = BinarySensor.operator('ptr')
 MQTTBinarySensorComponent = binary_sensor_ns.class_('MQTTBinarySensorComponent', mqtt.MQTTComponent)
 
 # Triggers

@@ -34,6 +40,10 @@ ClickTrigger = binary_sensor_ns.class_('ClickTrigger', Trigger.template(NoArg))
 DoubleClickTrigger = binary_sensor_ns.class_('DoubleClickTrigger', Trigger.template(NoArg))
 MultiClickTrigger = binary_sensor_ns.class_('MultiClickTrigger', Trigger.template(NoArg), Component)
 MultiClickTriggerEvent = binary_sensor_ns.struct('MultiClickTriggerEvent')
+StateTrigger = binary_sensor_ns.class_('StateTrigger', Trigger.template(bool_))
+
+# Condition
+BinarySensorCondition = binary_sensor_ns.class_('BinarySensorCondition', Condition)
 
 # Filters
 Filter = binary_sensor_ns.class_('Filter')

@@ -46,13 +56,13 @@ LambdaFilter = binary_sensor_ns.class_('LambdaFilter', Filter)
 
 FILTER_KEYS = [CONF_INVERT, CONF_DELAYED_ON, CONF_DELAYED_OFF, CONF_LAMBDA, CONF_HEARTBEAT]
 
-FILTERS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
+FILTERS_SCHEMA = cv.ensure_list({
     vol.Optional(CONF_INVERT): None,
     vol.Optional(CONF_DELAYED_ON): cv.positive_time_period_milliseconds,
     vol.Optional(CONF_DELAYED_OFF): cv.positive_time_period_milliseconds,
     vol.Optional(CONF_HEARTBEAT): cv.positive_time_period_milliseconds,
     vol.Optional(CONF_LAMBDA): cv.lambda_,
-}, cv.has_exactly_one_key(*FILTER_KEYS))])
+}, cv.has_exactly_one_key(*FILTER_KEYS))
 
 MULTI_CLICK_TIMING_SCHEMA = vol.Schema({
     vol.Optional(CONF_STATE): cv.boolean,

@@ -62,7 +72,7 @@ MULTI_CLICK_TIMING_SCHEMA = vol.Schema({
 
 
 def parse_multi_click_timing_str(value):
-    if not isinstance(value, basestring):
+    if not isinstance(value, string_types):
         return value
 
     parts = value.lower().split(' ')

@@ -150,7 +160,7 @@ def validate_multi_click_timing(value):
 BINARY_SENSOR_SCHEMA = cv.MQTT_COMPONENT_SCHEMA.extend({
     cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTBinarySensorComponent),
 
-    vol.Optional(CONF_DEVICE_CLASS): vol.All(vol.Lower, cv.one_of(*DEVICE_CLASSES)),
+    vol.Optional(CONF_DEVICE_CLASS): cv.one_of(*DEVICE_CLASSES, lower=True),
     vol.Optional(CONF_FILTERS): FILTERS_SCHEMA,
     vol.Optional(CONF_ON_PRESS): automation.validate_automation({
         cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(PressTrigger),

@@ -174,6 +184,9 @@ BINARY_SENSOR_SCHEMA = cv.MQTT_COMPONENT_SCHEMA.extend({
                                      validate_multi_click_timing),
         vol.Optional(CONF_INVALID_COOLDOWN): cv.positive_time_period_milliseconds,
     }),
+    vol.Optional(CONF_ON_STATE): automation.validate_automation({
+        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(StateTrigger),
+    }),
 
     vol.Optional(CONF_INVERTED): cv.invalid(
         "The inverted binary_sensor property has been replaced by the "

@@ -195,8 +208,8 @@ def setup_filter(config):
     elif CONF_HEARTBEAT in config:
         yield App.register_component(HeartbeatFilter.new(config[CONF_HEARTBEAT]))
     elif CONF_LAMBDA in config:
-        lambda_ = None
-        for lambda_ in process_lambda(config[CONF_LAMBDA], [(bool_, 'x')]):
+        for lambda_ in process_lambda(config[CONF_LAMBDA], [(bool_, 'x')],
+                                      return_type=optional.template(bool_)):
             yield None
         yield LambdaFilter.new(lambda_)

@@ -261,6 +274,11 @@ def setup_binary_sensor_core_(binary_sensor_var, mqtt_var, config):
             add(trigger.set_invalid_cooldown(conf[CONF_INVALID_COOLDOWN]))
         automation.build_automation(trigger, NoArg, conf)
 
+    for conf in config.get(CONF_ON_STATE, []):
+        rhs = binary_sensor_var.make_state_trigger()
+        trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
+        automation.build_automation(trigger, bool_, conf)
+
     setup_mqtt_component(mqtt_var, config)
 

@@ -269,14 +287,14 @@ def setup_binary_sensor(binary_sensor_obj, mqtt_obj, config):
                                    has_side_effects=False)
     mqtt_var = Pvariable(config[CONF_MQTT_ID], mqtt_obj,
                          has_side_effects=False)
-    add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
+    CORE.add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
 
 
 def register_binary_sensor(var, config):
     binary_sensor_var = Pvariable(config[CONF_ID], var, has_side_effects=True)
     rhs = App.register_binary_sensor(binary_sensor_var)
     mqtt_var = Pvariable(config[CONF_MQTT_ID], rhs, has_side_effects=True)
-    add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
+    CORE.add_job(setup_binary_sensor_core_, binary_sensor_var, mqtt_var, config)
 
 
 def core_to_hass_config(data, config):

@@ -290,3 +308,33 @@ def core_to_hass_config(data, config):
 
 
 BUILD_FLAGS = '-DUSE_BINARY_SENSOR'
+
+
+CONF_BINARY_SENSOR_IS_ON = 'binary_sensor.is_on'
+BINARY_SENSOR_IS_ON_CONDITION_SCHEMA = maybe_simple_id({
+    vol.Required(CONF_ID): cv.use_variable_id(BinarySensor),
+})
+
+
+@CONDITION_REGISTRY.register(CONF_BINARY_SENSOR_IS_ON, BINARY_SENSOR_IS_ON_CONDITION_SCHEMA)
+def binary_sensor_is_on_to_code(config, condition_id, arg_type, template_arg):
+    for var in get_variable(config[CONF_ID]):
+        yield None
+    rhs = var.make_binary_sensor_is_on_condition(template_arg)
+    type = BinarySensorCondition.template(arg_type)
+    yield Pvariable(condition_id, rhs, type=type)
+
+
+CONF_BINARY_SENSOR_IS_OFF = 'binary_sensor.is_off'
+BINARY_SENSOR_IS_OFF_CONDITION_SCHEMA = maybe_simple_id({
+    vol.Required(CONF_ID): cv.use_variable_id(BinarySensor),
+})
+
+
+@CONDITION_REGISTRY.register(CONF_BINARY_SENSOR_IS_OFF, BINARY_SENSOR_IS_OFF_CONDITION_SCHEMA)
+def binary_sensor_is_off_to_code(config, condition_id, arg_type, template_arg):
+    for var in get_variable(config[CONF_ID]):
+        yield None
+    rhs = var.make_binary_sensor_is_off_condition(template_arg)
+    type = BinarySensorCondition.template(arg_type)
+    yield Pvariable(condition_id, rhs, type=type)
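FILTERS_SCHEMA now uses the cv.ensure_list(...) form; a standalone sketch of that single-or-list normalization, written with a hypothetical helper rather than esphomeyaml's real implementation:

```python
# Hypothetical ensure_list helper: accept one mapping or a list of mappings and
# validate every item, which is the behaviour the new FILTERS_SCHEMA depends on.
def ensure_list(item_validator):
    def validator(value):
        if value is None:
            return []
        if not isinstance(value, list):
            value = [value]
        return [item_validator(v) for v in value]
    return validator


def filter_item(value):
    # Stand-in for the per-filter schema.
    assert isinstance(value, dict), "each filter must be a mapping"
    return value


filters = ensure_list(filter_item)
print(filters({'delayed_on': '100ms'}))                     # [{'delayed_on': '100ms'}]
print(filters([{'invert': None}, {'delayed_off': '1s'}]))   # both items validated
```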
esphomeyaml/components/binary_sensor/apds9960.py (new file, 36 lines)
@@ -0,0 +1,36 @@
import voluptuous as vol

from esphomeyaml.components import binary_sensor, sensor
from esphomeyaml.components.apds9960 import APDS9960, CONF_APDS9960_ID
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_DIRECTION, CONF_NAME
from esphomeyaml.cpp_generator import get_variable

DEPENDENCIES = ['apds9960']
APDS9960GestureDirectionBinarySensor = sensor.sensor_ns.class_(
    'APDS9960GestureDirectionBinarySensor', binary_sensor.BinarySensor)

DIRECTIONS = {
    'UP': 'make_up_direction',
    'DOWN': 'make_down_direction',
    'LEFT': 'make_left_direction',
    'RIGHT': 'make_right_direction',
}

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(APDS9960GestureDirectionBinarySensor),
    vol.Required(CONF_DIRECTION): cv.one_of(*DIRECTIONS, upper=True),
    cv.GenerateID(CONF_APDS9960_ID): cv.use_variable_id(APDS9960)
}))


def to_code(config):
    for hub in get_variable(config[CONF_APDS9960_ID]):
        yield
    func = getattr(hub, DIRECTIONS[config[CONF_DIRECTION]])
    rhs = func(config[CONF_NAME])
    binary_sensor.register_binary_sensor(rhs, config)


def to_hass_config(data, config):
    return binary_sensor.core_to_hass_config(data, config)
esphomeyaml/components/binary_sensor/custom.py (new file, 37 lines)
@@ -0,0 +1,37 @@
import voluptuous as vol

from esphomeyaml.components import binary_sensor
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_BINARY_SENSORS, CONF_ID, CONF_LAMBDA
from esphomeyaml.cpp_generator import process_lambda, variable
from esphomeyaml.cpp_types import std_vector

CustomBinarySensorConstructor = binary_sensor.binary_sensor_ns.class_(
    'CustomBinarySensorConstructor')

PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CustomBinarySensorConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Required(CONF_BINARY_SENSORS):
        cv.ensure_list(binary_sensor.BINARY_SENSOR_SCHEMA.extend({
            cv.GenerateID(): cv.declare_variable_id(binary_sensor.BinarySensor),
        })),
})


def to_code(config):
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(binary_sensor.BinarySensorPtr)):
        yield

    rhs = CustomBinarySensorConstructor(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, sens in enumerate(config[CONF_BINARY_SENSORS]):
        binary_sensor.register_binary_sensor(custom.get_binary_sensor(i), sens)


BUILD_FLAGS = '-DUSE_CUSTOM_BINARY_SENSOR'


def to_hass_config(data, config):
    return [binary_sensor.core_to_hass_config(data, sens) for sens in config[CONF_BINARY_SENSORS]]
@@ -5,7 +5,8 @@ from esphomeyaml.components.esp32_ble_tracker import CONF_ESP32_BLE_ID, ESP32BLE
     make_address_array
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_MAC_ADDRESS, CONF_NAME
-from esphomeyaml.helpers import esphomelib_ns, get_variable
+from esphomeyaml.cpp_generator import get_variable
+from esphomeyaml.cpp_types import esphomelib_ns
 
 DEPENDENCIES = ['esp32_ble_tracker']
 ESP32BLEPresenceDevice = esphomelib_ns.class_('ESP32BLEPresenceDevice', binary_sensor.BinarySensor)
|
@@ -4,7 +4,8 @@ import esphomeyaml.config_validation as cv
 from esphomeyaml.components import binary_sensor
 from esphomeyaml.components.esp32_touch import ESP32TouchComponent
 from esphomeyaml.const import CONF_NAME, CONF_PIN, CONF_THRESHOLD, ESP_PLATFORM_ESP32
-from esphomeyaml.helpers import get_variable, global_ns
+from esphomeyaml.cpp_generator import get_variable
+from esphomeyaml.cpp_types import global_ns
 from esphomeyaml.pins import validate_gpio_pin
 
 ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
@@ -4,8 +4,9 @@ import esphomeyaml.config_validation as cv
 from esphomeyaml import pins
 from esphomeyaml.components import binary_sensor
 from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_PIN
-from esphomeyaml.helpers import App, gpio_input_pin_expression, variable, Application, \
-    setup_component, Component
+from esphomeyaml.cpp_generator import variable
+from esphomeyaml.cpp_helpers import gpio_input_pin_expression, setup_component
+from esphomeyaml.cpp_types import Application, Component, App
 
 MakeGPIOBinarySensor = Application.struct('MakeGPIOBinarySensor')
 GPIOBinarySensorComponent = binary_sensor.binary_sensor_ns.class_('GPIOBinarySensorComponent',
@@ -4,7 +4,7 @@ from esphomeyaml.components import binary_sensor, display
 from esphomeyaml.components.display.nextion import Nextion
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_COMPONENT_ID, CONF_NAME, CONF_PAGE_ID
-from esphomeyaml.helpers import get_variable
+from esphomeyaml.cpp_generator import get_variable
 
 DEPENDENCIES = ['display']
 
@@ -22,7 +22,6 @@ PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend
 
 
 def to_code(config):
-    hub = None
     for hub in get_variable(config[CONF_NEXTION_ID]):
         yield
     rhs = hub.make_touch_component(config[CONF_NAME], config[CONF_PAGE_ID],
@@ -5,7 +5,7 @@ from esphomeyaml.components import binary_sensor
 from esphomeyaml.components.pn532 import PN532Component
 from esphomeyaml.const import CONF_NAME, CONF_UID
 from esphomeyaml.core import HexInt
-from esphomeyaml.helpers import ArrayInitializer, get_variable
+from esphomeyaml.cpp_generator import get_variable, ArrayInitializer
 
 DEPENDENCIES = ['pn532']
 
@@ -3,7 +3,7 @@ import voluptuous as vol
 import esphomeyaml.config_validation as cv
 from esphomeyaml.components import binary_sensor, rdm6300
 from esphomeyaml.const import CONF_NAME, CONF_UID
-from esphomeyaml.helpers import get_variable
+from esphomeyaml.cpp_generator import get_variable
 
 DEPENDENCIES = ['rdm6300']
 
@@ -11,7 +11,7 @@ from esphomeyaml.const import CONF_ADDRESS, CONF_CHANNEL, CONF_CODE, CONF_COMMAN
     CONF_PANASONIC, CONF_PROTOCOL, CONF_RAW, CONF_RC_SWITCH_RAW, CONF_RC_SWITCH_TYPE_A, \
     CONF_RC_SWITCH_TYPE_B, CONF_RC_SWITCH_TYPE_C, CONF_RC_SWITCH_TYPE_D, CONF_SAMSUNG, CONF_SONY, \
     CONF_STATE
-from esphomeyaml.helpers import ArrayInitializer, Pvariable, get_variable
+from esphomeyaml.cpp_generator import ArrayInitializer, get_variable, Pvariable
 
 DEPENDENCIES = ['remote_receiver']
 
@@ -39,7 +39,7 @@ PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend
     cv.GenerateID(): cv.declare_variable_id(RemoteReceiver),
     vol.Optional(CONF_LG): vol.Schema({
         vol.Required(CONF_DATA): cv.hex_uint32_t,
-        vol.Optional(CONF_NBITS, default=28): vol.All(vol.Coerce(int), cv.one_of(28, 32)),
+        vol.Optional(CONF_NBITS, default=28): cv.one_of(28, 32, int=True),
     }),
     vol.Optional(CONF_NEC): vol.Schema({
         vol.Required(CONF_ADDRESS): cv.hex_uint16_t,

@@ -50,7 +50,7 @@ PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend
     }),
     vol.Optional(CONF_SONY): vol.Schema({
         vol.Required(CONF_DATA): cv.hex_uint32_t,
-        vol.Optional(CONF_NBITS, default=12): vol.All(vol.Coerce(int), cv.one_of(12, 15, 20)),
+        vol.Optional(CONF_NBITS, default=12): cv.one_of(12, 15, 20, int=True),
     }),
     vol.Optional(CONF_PANASONIC): vol.Schema({
         vol.Required(CONF_ADDRESS): cv.hex_uint16_t,

@@ -73,39 +73,39 @@ def receiver_base(full_config):
     key, config = next((k, v) for k, v in full_config.items() if k in REMOTE_KEYS)
     if key == CONF_LG:
         return LGReceiver.new(name, config[CONF_DATA], config[CONF_NBITS])
-    elif key == CONF_NEC:
+    if key == CONF_NEC:
         return NECReceiver.new(name, config[CONF_ADDRESS], config[CONF_COMMAND])
-    elif key == CONF_PANASONIC:
+    if key == CONF_PANASONIC:
         return PanasonicReceiver.new(name, config[CONF_ADDRESS], config[CONF_COMMAND])
-    elif key == CONF_SAMSUNG:
+    if key == CONF_SAMSUNG:
         return SamsungReceiver.new(name, config[CONF_DATA])
-    elif key == CONF_SONY:
+    if key == CONF_SONY:
         return SonyReceiver.new(name, config[CONF_DATA], config[CONF_NBITS])
-    elif key == CONF_RAW:
+    if key == CONF_RAW:
         data = ArrayInitializer(*config, multiline=False)
         return RawReceiver.new(name, data)
-    elif key == CONF_RC_SWITCH_RAW:
+    if key == CONF_RC_SWITCH_RAW:
         return RCSwitchRawReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
                                        binary_code(config[CONF_CODE]), len(config[CONF_CODE]))
-    elif key == CONF_RC_SWITCH_TYPE_A:
+    if key == CONF_RC_SWITCH_TYPE_A:
         return RCSwitchTypeAReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
                                          binary_code(config[CONF_GROUP]),
                                          binary_code(config[CONF_DEVICE]),
                                          config[CONF_STATE])
-    elif key == CONF_RC_SWITCH_TYPE_B:
+    if key == CONF_RC_SWITCH_TYPE_B:
         return RCSwitchTypeBReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
                                          config[CONF_ADDRESS], config[CONF_CHANNEL],
                                          config[CONF_STATE])
-    elif key == CONF_RC_SWITCH_TYPE_C:
+    if key == CONF_RC_SWITCH_TYPE_C:
         return RCSwitchTypeCReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
                                          ord(config[CONF_FAMILY][0]) - ord('a'),
                                          config[CONF_GROUP], config[CONF_DEVICE],
                                          config[CONF_STATE])
-    elif key == CONF_RC_SWITCH_TYPE_D:
+    if key == CONF_RC_SWITCH_TYPE_D:
         return RCSwitchTypeDReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
                                          ord(config[CONF_GROUP][0]) - ord('a'),
                                          config[CONF_DEVICE], config[CONF_STATE])
-    else:
-        raise NotImplementedError("Unknown receiver type {}".format(config))
+    raise NotImplementedError("Unknown receiver type {}".format(config))
 
@@ -1,9 +1,10 @@
 import esphomeyaml.config_validation as cv
 from esphomeyaml.components import binary_sensor
 from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME
-from esphomeyaml.helpers import App, Application, variable, setup_component, Component
+from esphomeyaml.cpp_generator import variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import Application, Component, App
 
-DEPENDENCIES = ['mqtt']
 
 MakeStatusBinarySensor = Application.struct('MakeStatusBinarySensor')
 StatusBinarySensor = binary_sensor.binary_sensor_ns.class_('StatusBinarySensor',
@@ -3,8 +3,9 @@ import voluptuous as vol
 from esphomeyaml.components import binary_sensor
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_LAMBDA, CONF_MAKE_ID, CONF_NAME
-from esphomeyaml.helpers import App, Application, add, bool_, optional, process_lambda, variable, \
-    setup_component, Component
+from esphomeyaml.cpp_generator import variable, process_lambda, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import Application, Component, App, optional, bool_
 
 MakeTemplateBinarySensor = Application.struct('MakeTemplateBinarySensor')
 TemplateBinarySensor = binary_sensor.binary_sensor_ns.class_('TemplateBinarySensor',
@@ -1,11 +1,12 @@
 import voluptuous as vol
 
-from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
+from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
 from esphomeyaml.components import mqtt
+from esphomeyaml.components.mqtt import setup_mqtt_component
 import esphomeyaml.config_validation as cv
-from esphomeyaml.const import CONF_ID, CONF_MQTT_ID, CONF_INTERNAL
-from esphomeyaml.helpers import Pvariable, esphomelib_ns, setup_mqtt_component, add, \
-    TemplateArguments, get_variable, Action, Nameable
+from esphomeyaml.const import CONF_ID, CONF_INTERNAL, CONF_MQTT_ID
+from esphomeyaml.cpp_generator import Pvariable, add, get_variable
+from esphomeyaml.cpp_types import Action, Nameable, esphomelib_ns
 
 PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
 
@@ -54,8 +55,7 @@ COVER_OPEN_ACTION_SCHEMA = maybe_simple_id({
 
 
 @ACTION_REGISTRY.register(CONF_COVER_OPEN, COVER_OPEN_ACTION_SCHEMA)
-def cover_open_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def cover_open_to_code(config, action_id, arg_type, template_arg):
     for var in get_variable(config[CONF_ID]):
         yield None
     rhs = var.make_open_action(template_arg)

@@ -70,8 +70,7 @@ COVER_CLOSE_ACTION_SCHEMA = maybe_simple_id({
 
 
 @ACTION_REGISTRY.register(CONF_COVER_CLOSE, COVER_CLOSE_ACTION_SCHEMA)
-def cover_close_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def cover_close_to_code(config, action_id, arg_type, template_arg):
     for var in get_variable(config[CONF_ID]):
         yield None
     rhs = var.make_close_action(template_arg)

@@ -86,8 +85,7 @@ COVER_STOP_ACTION_SCHEMA = maybe_simple_id({
 
 
 @ACTION_REGISTRY.register(CONF_COVER_STOP, COVER_STOP_ACTION_SCHEMA)
-def cover_stop_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def cover_stop_to_code(config, action_id, arg_type, template_arg):
     for var in get_variable(config[CONF_ID]):
         yield None
     rhs = var.make_stop_action(template_arg)
@@ -5,8 +5,9 @@ from esphomeyaml import automation
 from esphomeyaml.components import cover
 from esphomeyaml.const import CONF_CLOSE_ACTION, CONF_LAMBDA, CONF_MAKE_ID, CONF_NAME, \
     CONF_OPEN_ACTION, CONF_STOP_ACTION, CONF_OPTIMISTIC
-from esphomeyaml.helpers import App, Application, NoArg, add, process_lambda, variable, optional, \
-    setup_component
+from esphomeyaml.cpp_generator import variable, process_lambda, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import Application, App, optional, NoArg
 
 MakeTemplateCover = Application.struct('MakeTemplateCover')
 TemplateCover = cover.cover_ns.class_('TemplateCover', cover.Cover)
esphomeyaml/components/custom_component.py (new file, 32 lines)
@@ -0,0 +1,32 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_ID, CONF_LAMBDA, CONF_COMPONENTS
from esphomeyaml.cpp_generator import process_lambda, variable
from esphomeyaml.cpp_helpers import setup_component
from esphomeyaml.cpp_types import Component, ComponentPtr, esphomelib_ns, std_vector

CustomComponentConstructor = esphomelib_ns.class_('CustomComponentConstructor')
MULTI_CONF = True

CONFIG_SCHEMA = vol.Schema({
    cv.GenerateID(): cv.declare_variable_id(CustomComponentConstructor),
    vol.Required(CONF_LAMBDA): cv.lambda_,
    vol.Optional(CONF_COMPONENTS): cv.ensure_list(vol.Schema({
        cv.GenerateID(): cv.declare_variable_id(Component)
    }).extend(cv.COMPONENT_SCHEMA.schema)),
})


def to_code(config):
    for template_ in process_lambda(config[CONF_LAMBDA], [],
                                    return_type=std_vector.template(ComponentPtr)):
        yield

    rhs = CustomComponentConstructor(template_)
    custom = variable(config[CONF_ID], rhs)
    for i, comp in enumerate(config.get(CONF_COMPONENTS, [])):
        setup_component(custom.get_component(i), comp)


BUILD_FLAGS = '-DUSE_CUSTOM_COMPONENT'
@ -1,25 +1,27 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import pins
|
from esphomeyaml import pins
|
||||||
from esphomeyaml.components import sensor
|
from esphomeyaml.components import sensor
|
||||||
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_ID, CONF_PIN, CONF_UPDATE_INTERVAL
|
from esphomeyaml.const import CONF_ID, CONF_PIN, CONF_UPDATE_INTERVAL
|
||||||
-from esphomeyaml.helpers import App, Pvariable, setup_component, PollingComponent
+from esphomeyaml.cpp_generator import Pvariable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, PollingComponent
|
||||||
|
|
||||||
DallasComponent = sensor.sensor_ns.class_('DallasComponent', PollingComponent)
|
DallasComponent = sensor.sensor_ns.class_('DallasComponent', PollingComponent)
|
||||||
|
MULTI_CONF = True
|
||||||
|
|
||||||
CONFIG_SCHEMA = vol.All(cv.ensure_list, [vol.Schema({
|
CONFIG_SCHEMA = vol.Schema({
|
||||||
cv.GenerateID(): cv.declare_variable_id(DallasComponent),
|
cv.GenerateID(): cv.declare_variable_id(DallasComponent),
|
||||||
vol.Required(CONF_PIN): pins.input_output_pin,
|
vol.Required(CONF_PIN): pins.input_pullup_pin,
|
||||||
vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
|
vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
|
||||||
}).extend(cv.COMPONENT_SCHEMA.schema)])
|
}).extend(cv.COMPONENT_SCHEMA.schema)
|
||||||
|
|
||||||
|
|
||||||
def to_code(config):
|
def to_code(config):
|
||||||
-    for conf in config:
-        rhs = App.make_dallas_component(conf[CONF_PIN], conf.get(CONF_UPDATE_INTERVAL))
-        var = Pvariable(conf[CONF_ID], rhs)
-        setup_component(var, conf)
+    rhs = App.make_dallas_component(config[CONF_PIN], config.get(CONF_UPDATE_INTERVAL))
+    var = Pvariable(config[CONF_ID], rhs)
+    setup_component(var, config)
|
|
||||||
|
|
||||||
|
|
||||||
BUILD_FLAGS = '-DUSE_DALLAS_SENSOR'
|
BUILD_FLAGS = '-DUSE_DALLAS_SENSOR'
|
||||||
|
|
|
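With MULTI_CONF enabled and the schema reduced to a single entry, each dallas hub is now declared as one list item; the pin and interval values below are illustrative:

```yaml
# Illustrative values; the pin must support input with pullup.
dallas:
  - pin: GPIO23
    update_interval: 30s
```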
@ -1,6 +1,7 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
-from esphomeyaml.helpers import App, add
+from esphomeyaml.cpp_generator import add
+from esphomeyaml.cpp_types import App
|
||||||
|
|
||||||
DEPENDENCIES = ['logger']
|
DEPENDENCIES = ['logger']
|
||||||
|
|
||||||
|
|
|
@ -2,11 +2,11 @@ import voluptuous as vol
|
||||||
|
|
||||||
from esphomeyaml import config_validation as cv, pins
|
from esphomeyaml import config_validation as cv, pins
|
||||||
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
|
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
|
||||||
-from esphomeyaml.const import CONF_ID, CONF_NUMBER, CONF_RUN_CYCLES, CONF_RUN_DURATION, \
-    CONF_SLEEP_DURATION, CONF_WAKEUP_PIN, CONF_MODE, CONF_PINS
-from esphomeyaml.helpers import Action, App, Component, Pvariable, TemplateArguments, add, \
-    esphomelib_ns, get_variable, gpio_input_pin_expression, setup_component, global_ns, \
-    StructInitializer
+from esphomeyaml.const import CONF_ID, CONF_MODE, CONF_NUMBER, CONF_PINS, CONF_RUN_CYCLES, \
+    CONF_RUN_DURATION, CONF_SLEEP_DURATION, CONF_WAKEUP_PIN
+from esphomeyaml.cpp_generator import Pvariable, StructInitializer, add, get_variable
+from esphomeyaml.cpp_helpers import gpio_input_pin_expression, setup_component
+from esphomeyaml.cpp_types import Action, App, Component, esphomelib_ns, global_ns
|
||||||
|
|
||||||
|
|
||||||
def validate_pin_number(value):
|
def validate_pin_number(value):
|
||||||
|
@ -43,12 +43,11 @@ CONFIG_SCHEMA = vol.Schema({
|
||||||
vol.Optional(CONF_SLEEP_DURATION): cv.positive_time_period_milliseconds,
|
vol.Optional(CONF_SLEEP_DURATION): cv.positive_time_period_milliseconds,
|
||||||
vol.Optional(CONF_WAKEUP_PIN): vol.All(cv.only_on_esp32, pins.internal_gpio_input_pin_schema,
|
vol.Optional(CONF_WAKEUP_PIN): vol.All(cv.only_on_esp32, pins.internal_gpio_input_pin_schema,
|
||||||
validate_pin_number),
|
validate_pin_number),
|
||||||
-    vol.Optional(CONF_WAKEUP_PIN_MODE): vol.All(cv.only_on_esp32, vol.Upper,
-                                                cv.one_of(*WAKEUP_PIN_MODES)),
+    vol.Optional(CONF_WAKEUP_PIN_MODE): vol.All(cv.only_on_esp32,
+                                                cv.one_of(*WAKEUP_PIN_MODES, upper=True)),
     vol.Optional(CONF_ESP32_EXT1_WAKEUP): vol.All(cv.only_on_esp32, vol.Schema({
-        vol.Required(CONF_PINS): vol.All(cv.ensure_list, [pins.shorthand_input_pin],
-                                         [validate_pin_number]),
-        vol.Required(CONF_MODE): vol.All(vol.Upper, cv.one_of(*EXT1_WAKEUP_MODES)),
+        vol.Required(CONF_PINS): cv.ensure_list(pins.shorthand_input_pin, validate_pin_number),
+        vol.Required(CONF_MODE): cv.one_of(*EXT1_WAKEUP_MODES, upper=True),
|
|
||||||
})),
|
})),
|
||||||
vol.Optional(CONF_RUN_CYCLES): cv.positive_int,
|
vol.Optional(CONF_RUN_CYCLES): cv.positive_int,
|
||||||
vol.Optional(CONF_RUN_DURATION): cv.positive_time_period_milliseconds,
|
vol.Optional(CONF_RUN_DURATION): cv.positive_time_period_milliseconds,
|
||||||
|
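A possible deep_sleep configuration matching this schema; the durations and the ESP32 wakeup pin are placeholders, not values taken from this diff:

```yaml
# Placeholder values for illustration only.
deep_sleep:
  run_duration: 20s
  sleep_duration: 50min
  wakeup_pin: GPIO39   # ESP32 only
```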
@ -95,8 +94,7 @@ DEEP_SLEEP_ENTER_ACTION_SCHEMA = maybe_simple_id({
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_ENTER, DEEP_SLEEP_ENTER_ACTION_SCHEMA)
|
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_ENTER, DEEP_SLEEP_ENTER_ACTION_SCHEMA)
|
||||||
def deep_sleep_enter_to_code(config, action_id, arg_type):
|
def deep_sleep_enter_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_enter_deep_sleep_action(template_arg)
|
rhs = var.make_enter_deep_sleep_action(template_arg)
|
||||||
|
@ -111,8 +109,7 @@ DEEP_SLEEP_PREVENT_ACTION_SCHEMA = maybe_simple_id({
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_PREVENT, DEEP_SLEEP_PREVENT_ACTION_SCHEMA)
|
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_PREVENT, DEEP_SLEEP_PREVENT_ACTION_SCHEMA)
|
||||||
def deep_sleep_prevent_to_code(config, action_id, arg_type):
|
def deep_sleep_prevent_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_prevent_deep_sleep_action(template_arg)
|
rhs = var.make_prevent_deep_sleep_action(template_arg)
|
||||||
|
|
|
@ -3,7 +3,9 @@ import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_LAMBDA, CONF_ROTATION, CONF_UPDATE_INTERVAL
|
from esphomeyaml.const import CONF_LAMBDA, CONF_ROTATION, CONF_UPDATE_INTERVAL
|
||||||
-from esphomeyaml.helpers import add, add_job, esphomelib_ns
+from esphomeyaml.core import CORE
+from esphomeyaml.cpp_generator import add
+from esphomeyaml.cpp_types import esphomelib_ns
|
||||||
|
|
||||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||||
|
|
||||||
|
@ -50,7 +52,7 @@ def setup_display_core_(display_var, config):
|
||||||
|
|
||||||
|
|
||||||
def setup_display(display_var, config):
|
def setup_display(display_var, config):
|
||||||
add_job(setup_display_core_, display_var, config)
|
CORE.add_job(setup_display_core_, display_var, config)
|
||||||
|
|
||||||
|
|
||||||
BUILD_FLAGS = '-DUSE_DISPLAY'
|
BUILD_FLAGS = '-DUSE_DISPLAY'
|
||||||
|
|
|
@ -1,12 +1,13 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import pins
|
from esphomeyaml import pins
|
||||||
from esphomeyaml.components import display
|
from esphomeyaml.components import display
|
||||||
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_DATA_PINS, CONF_DIMENSIONS, CONF_ENABLE_PIN, CONF_ID, \
|
from esphomeyaml.const import CONF_DATA_PINS, CONF_DIMENSIONS, CONF_ENABLE_PIN, CONF_ID, \
|
||||||
CONF_LAMBDA, CONF_RS_PIN, CONF_RW_PIN
|
CONF_LAMBDA, CONF_RS_PIN, CONF_RW_PIN
|
||||||
-from esphomeyaml.helpers import App, Pvariable, add, gpio_output_pin_expression, process_lambda, \
-    setup_component, PollingComponent
+from esphomeyaml.cpp_generator import Pvariable, add, process_lambda
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
+from esphomeyaml.cpp_types import App, PollingComponent, void
|
||||||
|
|
||||||
LCDDisplay = display.display_ns.class_('LCDDisplay', PollingComponent)
|
LCDDisplay = display.display_ns.class_('LCDDisplay', PollingComponent)
|
||||||
LCDDisplayRef = LCDDisplay.operator('ref')
|
LCDDisplayRef = LCDDisplay.operator('ref')
|
||||||
|
@ -63,7 +64,8 @@ def to_code(config):
|
||||||
add(lcd.set_rw_pin(rw))
|
add(lcd.set_rw_pin(rw))
|
||||||
|
|
||||||
if CONF_LAMBDA in config:
|
if CONF_LAMBDA in config:
|
||||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')]):
|
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')],
|
||||||
|
return_type=void):
|
||||||
yield
|
yield
|
||||||
add(lcd.set_writer(lambda_))
|
add(lcd.set_writer(lambda_))
|
||||||
|
|
||||||
|
|
|
@ -1,11 +1,13 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml.components import display, i2c
|
from esphomeyaml.components import display, i2c
|
||||||
-from esphomeyaml.components.display.lcd_gpio import LCDDisplayRef, validate_lcd_dimensions, \
-    LCDDisplay
+from esphomeyaml.components.display.lcd_gpio import LCDDisplay, LCDDisplayRef, \
+    validate_lcd_dimensions
+import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_ADDRESS, CONF_DIMENSIONS, CONF_ID, CONF_LAMBDA
-from esphomeyaml.helpers import App, Pvariable, add, process_lambda, setup_component
+from esphomeyaml.cpp_generator import Pvariable, add, process_lambda
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, void
|
||||||
|
|
||||||
DEPENDENCIES = ['i2c']
|
DEPENDENCIES = ['i2c']
|
||||||
|
|
||||||
|
@ -26,7 +28,8 @@ def to_code(config):
|
||||||
add(lcd.set_address(config[CONF_ADDRESS]))
|
add(lcd.set_address(config[CONF_ADDRESS]))
|
||||||
|
|
||||||
if CONF_LAMBDA in config:
|
if CONF_LAMBDA in config:
|
||||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')]):
|
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')],
|
||||||
|
return_type=void):
|
||||||
yield
|
yield
|
||||||
add(lcd.set_writer(lambda_))
|
add(lcd.set_writer(lambda_))
|
||||||
|
|
||||||
|
|
|
@ -1,13 +1,14 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import pins
|
from esphomeyaml import pins
|
||||||
from esphomeyaml.components import display, spi
|
from esphomeyaml.components import display, spi
|
||||||
from esphomeyaml.components.spi import SPIComponent
|
from esphomeyaml.components.spi import SPIComponent
|
||||||
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_CS_PIN, CONF_ID, CONF_INTENSITY, CONF_LAMBDA, CONF_NUM_CHIPS, \
|
from esphomeyaml.const import CONF_CS_PIN, CONF_ID, CONF_INTENSITY, CONF_LAMBDA, CONF_NUM_CHIPS, \
|
||||||
CONF_SPI_ID
|
CONF_SPI_ID
|
||||||
-from esphomeyaml.helpers import App, Pvariable, add, get_variable, gpio_output_pin_expression, \
-    process_lambda, setup_component, PollingComponent
+from esphomeyaml.cpp_generator import Pvariable, add, get_variable, process_lambda
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
+from esphomeyaml.cpp_types import App, PollingComponent, void
|
||||||
|
|
||||||
DEPENDENCIES = ['spi']
|
DEPENDENCIES = ['spi']
|
||||||
|
|
||||||
|
@ -38,7 +39,8 @@ def to_code(config):
|
||||||
add(max7219.set_intensity(config[CONF_INTENSITY]))
|
add(max7219.set_intensity(config[CONF_INTENSITY]))
|
||||||
|
|
||||||
if CONF_LAMBDA in config:
|
if CONF_LAMBDA in config:
|
||||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(MAX7219ComponentRef, 'it')]):
|
for lambda_ in process_lambda(config[CONF_LAMBDA], [(MAX7219ComponentRef, 'it')],
|
||||||
|
return_type=void):
|
||||||
yield
|
yield
|
||||||
add(max7219.set_writer(lambda_))
|
add(max7219.set_writer(lambda_))
|
||||||
|
|
||||||
|
|
|
@ -2,9 +2,9 @@ from esphomeyaml.components import display, uart
|
||||||
from esphomeyaml.components.uart import UARTComponent
|
from esphomeyaml.components.uart import UARTComponent
|
||||||
import esphomeyaml.config_validation as cv
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_ID, CONF_LAMBDA, CONF_UART_ID
|
from esphomeyaml.const import CONF_ID, CONF_LAMBDA, CONF_UART_ID
|
||||||
-from esphomeyaml.helpers import App, PollingComponent, Pvariable, add, get_variable, \
-    process_lambda, \
-    setup_component
+from esphomeyaml.cpp_generator import Pvariable, add, get_variable, process_lambda
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, PollingComponent, void
|
||||||
|
|
||||||
DEPENDENCIES = ['uart']
|
DEPENDENCIES = ['uart']
|
||||||
|
|
||||||
|
@ -24,7 +24,8 @@ def to_code(config):
|
||||||
nextion = Pvariable(config[CONF_ID], rhs)
|
nextion = Pvariable(config[CONF_ID], rhs)
|
||||||
|
|
||||||
if CONF_LAMBDA in config:
|
if CONF_LAMBDA in config:
|
||||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(NextionRef, 'it')]):
|
for lambda_ in process_lambda(config[CONF_LAMBDA], [(NextionRef, 'it')],
|
||||||
|
return_type=void):
|
||||||
yield
|
yield
|
||||||
add(nextion.set_writer(lambda_))
|
add(nextion.set_writer(lambda_))
|
||||||
|
|
||||||
|
|
|
@ -1,13 +1,14 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import pins
|
from esphomeyaml import pins
|
||||||
from esphomeyaml.components import display
|
from esphomeyaml.components import display
|
||||||
from esphomeyaml.components.display import ssd1306_spi
|
from esphomeyaml.components.display import ssd1306_spi
|
||||||
-from esphomeyaml.const import CONF_ADDRESS, CONF_EXTERNAL_VCC, CONF_ID, \
-    CONF_MODEL, CONF_RESET_PIN, CONF_LAMBDA
-from esphomeyaml.helpers import App, Pvariable, add, \
-    gpio_output_pin_expression, process_lambda, setup_component
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_ADDRESS, CONF_EXTERNAL_VCC, CONF_ID, CONF_LAMBDA, CONF_MODEL, \
+    CONF_RESET_PIN
+from esphomeyaml.cpp_generator import Pvariable, add, process_lambda
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
+from esphomeyaml.cpp_types import App, void
|
||||||
|
|
||||||
DEPENDENCIES = ['i2c']
|
DEPENDENCIES = ['i2c']
|
||||||
|
|
||||||
|
@ -36,7 +37,7 @@ def to_code(config):
|
||||||
add(ssd.set_address(config[CONF_ADDRESS]))
|
add(ssd.set_address(config[CONF_ADDRESS]))
|
||||||
if CONF_LAMBDA in config:
|
if CONF_LAMBDA in config:
|
||||||
for lambda_ in process_lambda(config[CONF_LAMBDA],
|
for lambda_ in process_lambda(config[CONF_LAMBDA],
|
||||||
[(display.DisplayBufferRef, 'it')]):
|
[(display.DisplayBufferRef, 'it')], return_type=void):
|
||||||
yield
|
yield
|
||||||
add(ssd.set_writer(lambda_))
|
add(ssd.set_writer(lambda_))
|
||||||
|
|
||||||
|
|
|
@ -1,14 +1,14 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import pins
|
from esphomeyaml import pins
|
||||||
from esphomeyaml.components import display, spi
|
from esphomeyaml.components import display, spi
|
||||||
from esphomeyaml.components.spi import SPIComponent
|
from esphomeyaml.components.spi import SPIComponent
|
||||||
-from esphomeyaml.const import CONF_CS_PIN, CONF_DC_PIN, CONF_EXTERNAL_VCC, \
-    CONF_ID, CONF_MODEL, \
-    CONF_RESET_PIN, CONF_SPI_ID, CONF_LAMBDA
-from esphomeyaml.helpers import App, Pvariable, add, get_variable, \
-    gpio_output_pin_expression, process_lambda, setup_component, PollingComponent
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_CS_PIN, CONF_DC_PIN, CONF_EXTERNAL_VCC, CONF_ID, CONF_LAMBDA, \
+    CONF_MODEL, CONF_RESET_PIN, CONF_SPI_ID
+from esphomeyaml.cpp_generator import Pvariable, add, get_variable, process_lambda
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
+from esphomeyaml.cpp_types import App, PollingComponent, void
|
||||||
|
|
||||||
DEPENDENCIES = ['spi']
|
DEPENDENCIES = ['spi']
|
||||||
|
|
||||||
|
@ -27,7 +27,7 @@ MODELS = {
|
||||||
'SH1106_64X48': SSD1306Model.SH1106_MODEL_64_48,
|
'SH1106_64X48': SSD1306Model.SH1106_MODEL_64_48,
|
||||||
}
|
}
|
||||||
|
|
||||||
SSD1306_MODEL = vol.All(vol.Upper, vol.Replace(' ', '_'), cv.one_of(*MODELS))
|
SSD1306_MODEL = cv.one_of(*MODELS, upper=True, space="_")
|
||||||
|
|
||||||
PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
|
PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||||
cv.GenerateID(): cv.declare_variable_id(SPISSD1306),
|
cv.GenerateID(): cv.declare_variable_id(SPISSD1306),
|
||||||
|
@ -60,7 +60,7 @@ def to_code(config):
|
||||||
add(ssd.set_external_vcc(config[CONF_EXTERNAL_VCC]))
|
add(ssd.set_external_vcc(config[CONF_EXTERNAL_VCC]))
|
||||||
if CONF_LAMBDA in config:
|
if CONF_LAMBDA in config:
|
||||||
for lambda_ in process_lambda(config[CONF_LAMBDA],
|
for lambda_ in process_lambda(config[CONF_LAMBDA],
|
||||||
[(display.DisplayBufferRef, 'it')]):
|
[(display.DisplayBufferRef, 'it')], return_type=void):
|
||||||
yield
|
yield
|
||||||
add(ssd.set_writer(lambda_))
|
add(ssd.set_writer(lambda_))
|
||||||
|
|
||||||
|
|
|
@ -6,8 +6,10 @@ from esphomeyaml.components import display, spi
|
||||||
from esphomeyaml.components.spi import SPIComponent
|
from esphomeyaml.components.spi import SPIComponent
|
||||||
from esphomeyaml.const import CONF_BUSY_PIN, CONF_CS_PIN, CONF_DC_PIN, CONF_FULL_UPDATE_EVERY, \
|
from esphomeyaml.const import CONF_BUSY_PIN, CONF_CS_PIN, CONF_DC_PIN, CONF_FULL_UPDATE_EVERY, \
|
||||||
CONF_ID, CONF_LAMBDA, CONF_MODEL, CONF_RESET_PIN, CONF_SPI_ID
|
CONF_ID, CONF_LAMBDA, CONF_MODEL, CONF_RESET_PIN, CONF_SPI_ID
|
||||||
-from esphomeyaml.helpers import App, Pvariable, add, get_variable, gpio_input_pin_expression, \
-    gpio_output_pin_expression, process_lambda, setup_component, PollingComponent
+from esphomeyaml.cpp_generator import get_variable, Pvariable, process_lambda, add
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, gpio_input_pin_expression, \
+    setup_component
+from esphomeyaml.cpp_types import PollingComponent, App, void
|
||||||
|
|
||||||
DEPENDENCIES = ['spi']
|
DEPENDENCIES = ['spi']
|
||||||
|
|
||||||
|
@ -43,7 +45,7 @@ PLATFORM_SCHEMA = vol.All(display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||||
cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
|
cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
|
||||||
vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
|
vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
|
||||||
vol.Required(CONF_DC_PIN): pins.gpio_output_pin_schema,
|
vol.Required(CONF_DC_PIN): pins.gpio_output_pin_schema,
|
||||||
vol.Required(CONF_MODEL): vol.All(vol.Lower, cv.one_of(*MODELS)),
|
vol.Required(CONF_MODEL): cv.one_of(*MODELS, lower=True),
|
||||||
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
|
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
|
||||||
vol.Optional(CONF_BUSY_PIN): pins.gpio_input_pin_schema,
|
vol.Optional(CONF_BUSY_PIN): pins.gpio_input_pin_schema,
|
||||||
vol.Optional(CONF_FULL_UPDATE_EVERY): cv.uint32_t,
|
vol.Optional(CONF_FULL_UPDATE_EVERY): cv.uint32_t,
|
||||||
|
@ -69,7 +71,8 @@ def to_code(config):
|
||||||
raise NotImplementedError()
|
raise NotImplementedError()
|
||||||
|
|
||||||
if CONF_LAMBDA in config:
|
if CONF_LAMBDA in config:
|
||||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(display.DisplayBufferRef, 'it')]):
|
for lambda_ in process_lambda(config[CONF_LAMBDA], [(display.DisplayBufferRef, 'it')],
|
||||||
|
return_type=void):
|
||||||
yield
|
yield
|
||||||
add(epaper.set_writer(lambda_))
|
add(epaper.set_writer(lambda_))
|
||||||
if CONF_RESET_PIN in config:
|
if CONF_RESET_PIN in config:
|
||||||
|
|
|
@ -2,8 +2,9 @@ import voluptuous as vol
|
||||||
|
|
||||||
from esphomeyaml import config_validation as cv
|
from esphomeyaml import config_validation as cv
|
||||||
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, CONF_TYPE, CONF_UUID, ESP_PLATFORM_ESP32
|
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, CONF_TYPE, CONF_UUID, ESP_PLATFORM_ESP32
|
||||||
-from esphomeyaml.helpers import App, ArrayInitializer, Component, Pvariable, RawExpression, add, \
-    esphomelib_ns, setup_component
+from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, RawExpression, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Component, esphomelib_ns
|
||||||
|
|
||||||
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||||
|
|
||||||
|
@ -14,7 +15,7 @@ CONF_MINOR = 'minor'
|
||||||
|
|
||||||
CONFIG_SCHEMA = vol.Schema({
|
CONFIG_SCHEMA = vol.Schema({
|
||||||
cv.GenerateID(): cv.declare_variable_id(ESP32BLEBeacon),
|
cv.GenerateID(): cv.declare_variable_id(ESP32BLEBeacon),
|
||||||
vol.Required(CONF_TYPE): vol.All(vol.Upper, cv.one_of('IBEACON')),
|
vol.Required(CONF_TYPE): cv.one_of('IBEACON', upper=True),
|
||||||
vol.Required(CONF_UUID): cv.uuid,
|
vol.Required(CONF_UUID): cv.uuid,
|
||||||
vol.Optional(CONF_MAJOR): cv.uint16_t,
|
vol.Optional(CONF_MAJOR): cv.uint16_t,
|
||||||
vol.Optional(CONF_MINOR): cv.uint16_t,
|
vol.Optional(CONF_MINOR): cv.uint16_t,
|
||||||
|
|
|
@ -4,8 +4,9 @@ from esphomeyaml import config_validation as cv
|
||||||
from esphomeyaml.components import sensor
|
from esphomeyaml.components import sensor
|
||||||
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32
|
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32
|
||||||
from esphomeyaml.core import HexInt
|
from esphomeyaml.core import HexInt
|
||||||
-from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, ArrayInitializer, \
-    setup_component, Component
+from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Component, esphomelib_ns
|
||||||
|
|
||||||
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||||
|
|
||||||
|
|
|
@ -2,11 +2,13 @@ import voluptuous as vol
|
||||||
|
|
||||||
from esphomeyaml import config_validation as cv
|
from esphomeyaml import config_validation as cv
|
||||||
from esphomeyaml.components import binary_sensor
|
from esphomeyaml.components import binary_sensor
|
||||||
-from esphomeyaml.const import CONF_ID, CONF_SETUP_MODE, CONF_IIR_FILTER, \
-    CONF_SLEEP_DURATION, CONF_MEASUREMENT_DURATION, CONF_LOW_VOLTAGE_REFERENCE, \
-    CONF_HIGH_VOLTAGE_REFERENCE, CONF_VOLTAGE_ATTENUATION, ESP_PLATFORM_ESP32
+from esphomeyaml.const import CONF_HIGH_VOLTAGE_REFERENCE, CONF_ID, CONF_IIR_FILTER, \
+    CONF_LOW_VOLTAGE_REFERENCE, CONF_MEASUREMENT_DURATION, CONF_SETUP_MODE, CONF_SLEEP_DURATION, \
+    CONF_VOLTAGE_ATTENUATION, ESP_PLATFORM_ESP32
 from esphomeyaml.core import TimePeriod
-from esphomeyaml.helpers import App, Pvariable, add, global_ns, setup_component, Component
+from esphomeyaml.cpp_generator import Pvariable, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Component, global_ns
|
||||||
|
|
||||||
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||||
|
|
||||||
|
@ -19,6 +21,7 @@ def validate_voltage(values):
|
||||||
if not value.endswith('V'):
|
if not value.endswith('V'):
|
||||||
value += 'V'
|
value += 'V'
|
||||||
return cv.one_of(*values)(value)
|
return cv.one_of(*values)(value)
|
||||||
|
|
||||||
return validator
|
return validator
|
||||||
|
|
||||||
|
|
||||||
|
|
73
esphomeyaml/components/ethernet.py
Normal file
|
@ -0,0 +1,73 @@
|
||||||
|
import voluptuous as vol
|
||||||
|
|
||||||
|
from esphomeyaml import pins
|
||||||
|
from esphomeyaml.components import wifi
|
||||||
|
import esphomeyaml.config_validation as cv
|
||||||
|
from esphomeyaml.const import CONF_DOMAIN, CONF_HOSTNAME, CONF_ID, CONF_MANUAL_IP, CONF_TYPE, \
|
||||||
|
ESP_PLATFORM_ESP32
|
||||||
|
from esphomeyaml.cpp_generator import Pvariable, add
|
||||||
|
from esphomeyaml.cpp_helpers import gpio_output_pin_expression
|
||||||
|
from esphomeyaml.cpp_types import App, Component, esphomelib_ns, global_ns
|
||||||
|
|
||||||
|
CONFLICTS_WITH = ['wifi']
|
||||||
|
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||||
|
|
||||||
|
CONF_PHY_ADDR = 'phy_addr'
|
||||||
|
CONF_MDC_PIN = 'mdc_pin'
|
||||||
|
CONF_MDIO_PIN = 'mdio_pin'
|
||||||
|
CONF_CLK_MODE = 'clk_mode'
|
||||||
|
CONF_POWER_PIN = 'power_pin'
|
||||||
|
|
||||||
|
EthernetType = esphomelib_ns.enum('EthernetType')
|
||||||
|
ETHERNET_TYPES = {
|
||||||
|
'LAN8720': EthernetType.ETHERNET_TYPE_LAN8720,
|
||||||
|
'TLK110': EthernetType.ETHERNET_TYPE_TLK110,
|
||||||
|
}
|
||||||
|
|
||||||
|
eth_clock_mode_t = global_ns.enum('eth_clock_mode_t')
|
||||||
|
CLK_MODES = {
|
||||||
|
'GPIO0_IN': eth_clock_mode_t.ETH_CLOCK_GPIO0_IN,
|
||||||
|
'GPIO0_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO0_OUT,
|
||||||
|
'GPIO16_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO16_OUT,
|
||||||
|
'GPIO17_OUT': eth_clock_mode_t.ETH_CLOCK_GPIO17_OUT,
|
||||||
|
}
|
||||||
|
|
||||||
|
EthernetComponent = esphomelib_ns.class_('EthernetComponent', Component)
|
||||||
|
|
||||||
|
CONFIG_SCHEMA = vol.Schema({
|
||||||
|
cv.GenerateID(): cv.declare_variable_id(EthernetComponent),
|
||||||
|
vol.Required(CONF_TYPE): cv.one_of(*ETHERNET_TYPES, upper=True),
|
||||||
|
vol.Required(CONF_MDC_PIN): pins.output_pin,
|
||||||
|
vol.Required(CONF_MDIO_PIN): pins.input_output_pin,
|
||||||
|
vol.Optional(CONF_CLK_MODE, default='GPIO0_IN'): cv.one_of(*CLK_MODES, upper=True, space='_'),
|
||||||
|
vol.Optional(CONF_PHY_ADDR, default=0): vol.All(cv.int_, vol.Range(min=0, max=31)),
|
||||||
|
vol.Optional(CONF_POWER_PIN): pins.gpio_output_pin_schema,
|
||||||
|
vol.Optional(CONF_MANUAL_IP): wifi.STA_MANUAL_IP_SCHEMA,
|
||||||
|
vol.Optional(CONF_HOSTNAME): cv.hostname,
|
||||||
|
vol.Optional(CONF_DOMAIN, default='.local'): cv.domain_name,
|
||||||
|
})
|
||||||
|
|
||||||
|
|
||||||
|
def to_code(config):
|
||||||
|
rhs = App.init_ethernet()
|
||||||
|
eth = Pvariable(config[CONF_ID], rhs)
|
||||||
|
|
||||||
|
add(eth.set_phy_addr(config[CONF_PHY_ADDR]))
|
||||||
|
add(eth.set_mdc_pin(config[CONF_MDC_PIN]))
|
||||||
|
add(eth.set_mdio_pin(config[CONF_MDIO_PIN]))
|
||||||
|
add(eth.set_type(ETHERNET_TYPES[config[CONF_TYPE]]))
|
||||||
|
add(eth.set_clk_mode(CLK_MODES[config[CONF_CLK_MODE]]))
|
||||||
|
|
||||||
|
if CONF_POWER_PIN in config:
|
||||||
|
for pin in gpio_output_pin_expression(config[CONF_POWER_PIN]):
|
||||||
|
yield
|
||||||
|
add(eth.set_power_pin(pin))
|
||||||
|
|
||||||
|
if CONF_HOSTNAME in config:
|
||||||
|
add(eth.set_hostname(config[CONF_HOSTNAME]))
|
||||||
|
|
||||||
|
if CONF_MANUAL_IP in config:
|
||||||
|
add(eth.set_manual_ip(wifi.manual_ip(config[CONF_MANUAL_IP])))
|
||||||
|
|
||||||
|
|
||||||
|
REQUIRED_BUILD_FLAGS = '-DUSE_ETHERNET'
|
|
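A minimal configuration for the new ethernet component might look like this; the pin assignments reflect common LAN8720 breakout wiring and are not mandated by this diff:

```yaml
# Pin numbers are illustrative (typical LAN8720 wiring).
ethernet:
  type: LAN8720
  mdc_pin: GPIO23
  mdio_pin: GPIO18
  clk_mode: GPIO0_IN
  phy_addr: 0
```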
@ -1,13 +1,14 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
|
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
|
||||||
from esphomeyaml.components import mqtt
|
from esphomeyaml.components import mqtt
|
||||||
+from esphomeyaml.components.mqtt import setup_mqtt_component
 import esphomeyaml.config_validation as cv
-from esphomeyaml.const import CONF_ID, CONF_MQTT_ID, CONF_OSCILLATION_COMMAND_TOPIC, \
-    CONF_OSCILLATION_STATE_TOPIC, CONF_SPEED_COMMAND_TOPIC, CONF_SPEED_STATE_TOPIC, CONF_INTERNAL, \
-    CONF_SPEED, CONF_OSCILLATING, CONF_OSCILLATION_OUTPUT, CONF_NAME
+from esphomeyaml.const import CONF_ID, CONF_INTERNAL, CONF_MQTT_ID, CONF_NAME, CONF_OSCILLATING, \
+    CONF_OSCILLATION_COMMAND_TOPIC, CONF_OSCILLATION_OUTPUT, CONF_OSCILLATION_STATE_TOPIC, \
+    CONF_SPEED, CONF_SPEED_COMMAND_TOPIC, CONF_SPEED_STATE_TOPIC
-from esphomeyaml.helpers import Application, Pvariable, add, esphomelib_ns, setup_mqtt_component, \
-    TemplateArguments, get_variable, templatable, bool_, Action, Nameable, Component
+from esphomeyaml.cpp_generator import add, Pvariable, get_variable, templatable
+from esphomeyaml.cpp_types import Application, Component, Nameable, esphomelib_ns, Action, bool_
|
||||||
|
|
||||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||||
|
|
||||||
|
@ -32,13 +33,14 @@ FAN_SPEED_HIGH = FanSpeed.FAN_SPEED_HIGH
|
||||||
FAN_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
|
FAN_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
|
||||||
cv.GenerateID(): cv.declare_variable_id(FanState),
|
cv.GenerateID(): cv.declare_variable_id(FanState),
|
||||||
cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTFanComponent),
|
cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTFanComponent),
|
||||||
vol.Optional(CONF_OSCILLATION_STATE_TOPIC): cv.publish_topic,
|
vol.Optional(CONF_OSCILLATION_STATE_TOPIC): vol.All(cv.requires_component('mqtt'),
|
||||||
vol.Optional(CONF_OSCILLATION_COMMAND_TOPIC): cv.subscribe_topic,
|
cv.publish_topic),
|
||||||
|
vol.Optional(CONF_OSCILLATION_COMMAND_TOPIC): vol.All(cv.requires_component('mqtt'),
|
||||||
|
cv.subscribe_topic),
|
||||||
})
|
})
|
||||||
|
|
||||||
FAN_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(FAN_SCHEMA.schema)
|
FAN_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(FAN_SCHEMA.schema)
|
||||||
|
|
||||||
|
|
||||||
FAN_SPEEDS = {
|
FAN_SPEEDS = {
|
||||||
'OFF': FAN_SPEED_OFF,
|
'OFF': FAN_SPEED_OFF,
|
||||||
'LOW': FAN_SPEED_LOW,
|
'LOW': FAN_SPEED_LOW,
|
||||||
|
@ -47,10 +49,6 @@ FAN_SPEEDS = {
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
-def validate_fan_speed(value):
-    return vol.All(vol.Upper, cv.one_of(*FAN_SPEEDS))(value)
|
|
||||||
|
|
||||||
|
|
||||||
def setup_fan_core_(fan_var, mqtt_var, config):
|
def setup_fan_core_(fan_var, mqtt_var, config):
|
||||||
if CONF_INTERNAL in config:
|
if CONF_INTERNAL in config:
|
||||||
add(fan_var.set_internal(config[CONF_INTERNAL]))
|
add(fan_var.set_internal(config[CONF_INTERNAL]))
|
||||||
|
@ -74,7 +72,6 @@ def setup_fan(fan_obj, mqtt_obj, config):
|
||||||
|
|
||||||
BUILD_FLAGS = '-DUSE_FAN'
|
BUILD_FLAGS = '-DUSE_FAN'
|
||||||
|
|
||||||
|
|
||||||
CONF_FAN_TOGGLE = 'fan.toggle'
|
CONF_FAN_TOGGLE = 'fan.toggle'
|
||||||
FAN_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
|
FAN_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
|
||||||
vol.Required(CONF_ID): cv.use_variable_id(FanState),
|
vol.Required(CONF_ID): cv.use_variable_id(FanState),
|
||||||
|
@ -82,8 +79,7 @@ FAN_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_FAN_TOGGLE, FAN_TOGGLE_ACTION_SCHEMA)
|
@ACTION_REGISTRY.register(CONF_FAN_TOGGLE, FAN_TOGGLE_ACTION_SCHEMA)
|
||||||
def fan_toggle_to_code(config, action_id, arg_type):
|
def fan_toggle_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_toggle_action(template_arg)
|
rhs = var.make_toggle_action(template_arg)
|
||||||
|
@ -98,8 +94,7 @@ FAN_TURN_OFF_ACTION_SCHEMA = maybe_simple_id({
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_FAN_TURN_OFF, FAN_TURN_OFF_ACTION_SCHEMA)
|
@ACTION_REGISTRY.register(CONF_FAN_TURN_OFF, FAN_TURN_OFF_ACTION_SCHEMA)
|
||||||
def fan_turn_off_to_code(config, action_id, arg_type):
|
def fan_turn_off_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_turn_off_action(template_arg)
|
rhs = var.make_turn_off_action(template_arg)
|
||||||
|
@ -111,13 +106,12 @@ CONF_FAN_TURN_ON = 'fan.turn_on'
|
||||||
FAN_TURN_ON_ACTION_SCHEMA = maybe_simple_id({
|
FAN_TURN_ON_ACTION_SCHEMA = maybe_simple_id({
|
||||||
vol.Required(CONF_ID): cv.use_variable_id(FanState),
|
vol.Required(CONF_ID): cv.use_variable_id(FanState),
|
||||||
vol.Optional(CONF_OSCILLATING): cv.templatable(cv.boolean),
|
vol.Optional(CONF_OSCILLATING): cv.templatable(cv.boolean),
|
||||||
vol.Optional(CONF_SPEED): cv.templatable(validate_fan_speed),
|
vol.Optional(CONF_SPEED): cv.templatable(cv.one_of(*FAN_SPEEDS, upper=True)),
|
||||||
})
|
})
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_FAN_TURN_ON, FAN_TURN_ON_ACTION_SCHEMA)
|
@ACTION_REGISTRY.register(CONF_FAN_TURN_ON, FAN_TURN_ON_ACTION_SCHEMA)
|
||||||
def fan_turn_on_to_code(config, action_id, arg_type):
|
def fan_turn_on_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_turn_on_action(template_arg)
|
rhs = var.make_turn_on_action(template_arg)
|
||||||
|
|
|
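With validate_fan_speed folded into cv.one_of(..., upper=True), a fan.turn_on action inside an automation could be written roughly as below; the fan id is hypothetical:

```yaml
# my_fan is a hypothetical fan id defined elsewhere in the config.
- fan.turn_on:
    id: my_fan
    oscillating: true
    speed: HIGH
```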
@ -3,7 +3,9 @@ import voluptuous as vol
|
||||||
import esphomeyaml.config_validation as cv
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.components import fan, output
|
from esphomeyaml.components import fan, output
|
||||||
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_OSCILLATION_OUTPUT, CONF_OUTPUT
|
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_OSCILLATION_OUTPUT, CONF_OUTPUT
|
||||||
-from esphomeyaml.helpers import App, add, get_variable, variable, setup_component
+from esphomeyaml.cpp_generator import get_variable, variable, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App
|
||||||
|
|
||||||
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
|
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
|
||||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
|
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
|
||||||
|
|
|
@ -5,7 +5,8 @@ from esphomeyaml.components import fan, mqtt, output
|
||||||
from esphomeyaml.const import CONF_HIGH, CONF_LOW, CONF_MAKE_ID, CONF_MEDIUM, CONF_NAME, \
|
from esphomeyaml.const import CONF_HIGH, CONF_LOW, CONF_MAKE_ID, CONF_MEDIUM, CONF_NAME, \
|
||||||
CONF_OSCILLATION_OUTPUT, CONF_OUTPUT, CONF_SPEED, CONF_SPEED_COMMAND_TOPIC, \
|
CONF_OSCILLATION_OUTPUT, CONF_OUTPUT, CONF_SPEED, CONF_SPEED_COMMAND_TOPIC, \
|
||||||
CONF_SPEED_STATE_TOPIC
|
CONF_SPEED_STATE_TOPIC
|
||||||
-from esphomeyaml.helpers import App, add, get_variable, variable
+from esphomeyaml.cpp_generator import get_variable, variable, add
+from esphomeyaml.cpp_types import App
|
||||||
|
|
||||||
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
|
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
|
||||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
|
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
|
||||||
|
|
|
@ -1,15 +1,17 @@
|
||||||
# coding=utf-8
|
# coding=utf-8
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import core
|
from esphomeyaml import core
|
||||||
from esphomeyaml.components import display
|
from esphomeyaml.components import display
|
||||||
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_FILE, CONF_GLYPHS, CONF_ID, CONF_SIZE
|
from esphomeyaml.const import CONF_FILE, CONF_GLYPHS, CONF_ID, CONF_SIZE
|
||||||
from esphomeyaml.core import HexInt
|
from esphomeyaml.core import CORE, HexInt
|
||||||
-from esphomeyaml.helpers import App, ArrayInitializer, MockObj, Pvariable, RawExpression, add, \
-    relative_path
+from esphomeyaml.cpp_generator import ArrayInitializer, MockObj, Pvariable, RawExpression, add
+from esphomeyaml.cpp_types import App
+from esphomeyaml.py_compat import sort_by_cmp
|
||||||
|
|
||||||
DEPENDENCIES = ['display']
|
DEPENDENCIES = ['display']
|
||||||
|
MULTI_CONF = True
|
||||||
|
|
||||||
Font = display.display_ns.class_('Font')
|
Font = display.display_ns.class_('Font')
|
||||||
Glyph = display.display_ns.class_('Glyph')
|
Glyph = display.display_ns.class_('Glyph')
|
||||||
|
@ -32,12 +34,11 @@ def validate_glyphs(value):
|
||||||
|
|
||||||
if len(x_) < len(y_):
|
if len(x_) < len(y_):
|
||||||
return -1
|
return -1
|
||||||
elif len(x_) > len(y_):
|
if len(x_) > len(y_):
|
||||||
return 1
|
return 1
|
||||||
else:
|
|
||||||
raise vol.Invalid(u"Found duplicate glyph {}".format(x))
|
raise vol.Invalid(u"Found duplicate glyph {}".format(x))
|
||||||
|
|
||||||
value.sort(cmp=comparator)
|
sort_by_cmp(value, comparator)
|
||||||
return value
|
return value
|
||||||
|
|
||||||
|
|
||||||
|
@ -46,11 +47,11 @@ def validate_pillow_installed(value):
|
||||||
import PIL
|
import PIL
|
||||||
except ImportError:
|
except ImportError:
|
||||||
raise vol.Invalid("Please install the pillow python package to use this feature. "
|
raise vol.Invalid("Please install the pillow python package to use this feature. "
|
||||||
"(pip2 install pillow)")
|
"(pip install pillow)")
|
||||||
|
|
||||||
if PIL.__version__[0] < '4':
|
if PIL.__version__[0] < '4':
|
||||||
raise vol.Invalid("Please update your pillow installation to at least 4.0.x. "
|
raise vol.Invalid("Please update your pillow installation to at least 4.0.x. "
|
||||||
"(pip2 install -U pillow)")
|
"(pip install -U pillow)")
|
||||||
|
|
||||||
return value
|
return value
|
||||||
|
|
||||||
|
@ -76,24 +77,23 @@ FONT_SCHEMA = vol.Schema({
|
||||||
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
|
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
|
||||||
})
|
})
|
||||||
|
|
||||||
CONFIG_SCHEMA = vol.All(validate_pillow_installed, cv.ensure_list, [FONT_SCHEMA])
|
CONFIG_SCHEMA = vol.All(validate_pillow_installed, FONT_SCHEMA)
|
||||||
|
|
||||||
|
|
||||||
def to_code(config):
|
def to_code(config):
|
||||||
from PIL import ImageFont
|
from PIL import ImageFont
|
||||||
|
|
||||||
for conf in config:
|
path = CORE.relative_path(config[CONF_FILE])
|
||||||
path = relative_path(conf[CONF_FILE])
|
|
||||||
try:
|
try:
|
||||||
font = ImageFont.truetype(path, conf[CONF_SIZE])
|
font = ImageFont.truetype(path, config[CONF_SIZE])
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
raise core.ESPHomeYAMLError(u"Could not load truetype file {}: {}".format(path, e))
|
raise core.EsphomeyamlError(u"Could not load truetype file {}: {}".format(path, e))
|
||||||
|
|
||||||
ascent, descent = font.getmetrics()
|
ascent, descent = font.getmetrics()
|
||||||
|
|
||||||
glyph_args = {}
|
glyph_args = {}
|
||||||
data = []
|
data = []
|
||||||
for glyph in conf[CONF_GLYPHS]:
|
for glyph in config[CONF_GLYPHS]:
|
||||||
mask = font.getmask(glyph, mode='1')
|
mask = font.getmask(glyph, mode='1')
|
||||||
_, (offset_x, offset_y) = font.font.getsize(glyph)
|
_, (offset_x, offset_y) = font.font.getsize(glyph)
|
||||||
width, height = mask.size
|
width, height = mask.size
|
||||||
|
@ -108,14 +108,14 @@ def to_code(config):
|
||||||
glyph_args[glyph] = (len(data), offset_x, offset_y, width, height)
|
glyph_args[glyph] = (len(data), offset_x, offset_y, width, height)
|
||||||
data += glyph_data
|
data += glyph_data
|
||||||
|
|
||||||
raw_data = MockObj(conf[CONF_RAW_DATA_ID])
|
raw_data = MockObj(config[CONF_RAW_DATA_ID])
|
||||||
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
|
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
|
||||||
raw_data, len(data),
|
raw_data, len(data),
|
||||||
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
|
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
|
||||||
|
|
||||||
glyphs = []
|
glyphs = []
|
||||||
for glyph in conf[CONF_GLYPHS]:
|
for glyph in config[CONF_GLYPHS]:
|
||||||
glyphs.append(Glyph(glyph, raw_data, *glyph_args[glyph]))
|
glyphs.append(Glyph(glyph, raw_data, *glyph_args[glyph]))
|
||||||
|
|
||||||
rhs = App.make_font(ArrayInitializer(*glyphs), ascent, ascent + descent)
|
rhs = App.make_font(ArrayInitializer(*glyphs), ascent, ascent + descent)
|
||||||
Pvariable(conf[CONF_ID], rhs)
|
Pvariable(config[CONF_ID], rhs)
|
||||||
|
|
|
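Since font is now MULTI_CONF with a flat schema, each font becomes one list entry; the file path and glyph set below are placeholders:

```yaml
# File path and glyphs are placeholders.
font:
  - id: my_font
    file: "fonts/OpenSans-Regular.ttf"
    size: 20
    glyphs: "0123456789:. "
```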
@ -2,34 +2,34 @@ import voluptuous as vol
|
||||||
|
|
||||||
from esphomeyaml import config_validation as cv
|
from esphomeyaml import config_validation as cv
|
||||||
from esphomeyaml.const import CONF_ID, CONF_INITIAL_VALUE, CONF_RESTORE_VALUE, CONF_TYPE
|
from esphomeyaml.const import CONF_ID, CONF_INITIAL_VALUE, CONF_RESTORE_VALUE, CONF_TYPE
|
||||||
-from esphomeyaml.helpers import App, Component, Pvariable, RawExpression, TemplateArguments, add, \
-    esphomelib_ns, setup_component
+from esphomeyaml.cpp_generator import Pvariable, RawExpression, TemplateArguments, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Component, esphomelib_ns
|
||||||
|
|
||||||
GlobalVariableComponent = esphomelib_ns.class_('GlobalVariableComponent', Component)
|
GlobalVariableComponent = esphomelib_ns.class_('GlobalVariableComponent', Component)
|
||||||
|
|
||||||
GLOBAL_VAR_SCHEMA = vol.Schema({
|
MULTI_CONF = True
|
||||||
|
|
||||||
|
CONFIG_SCHEMA = vol.Schema({
|
||||||
vol.Required(CONF_ID): cv.declare_variable_id(GlobalVariableComponent),
|
vol.Required(CONF_ID): cv.declare_variable_id(GlobalVariableComponent),
|
||||||
vol.Required(CONF_TYPE): cv.string_strict,
|
vol.Required(CONF_TYPE): cv.string_strict,
|
||||||
vol.Optional(CONF_INITIAL_VALUE): cv.string_strict,
|
vol.Optional(CONF_INITIAL_VALUE): cv.string_strict,
|
||||||
vol.Optional(CONF_RESTORE_VALUE): cv.boolean,
|
vol.Optional(CONF_RESTORE_VALUE): cv.boolean,
|
||||||
}).extend(cv.COMPONENT_SCHEMA.schema)
|
}).extend(cv.COMPONENT_SCHEMA.schema)
|
||||||
|
|
||||||
CONFIG_SCHEMA = vol.All(cv.ensure_list, [GLOBAL_VAR_SCHEMA])
|
|
||||||
|
|
||||||
|
|
||||||
def to_code(config):
|
def to_code(config):
|
||||||
for conf in config:
|
type_ = RawExpression(config[CONF_TYPE])
|
||||||
type_ = RawExpression(conf[CONF_TYPE])
|
|
||||||
template_args = TemplateArguments(type_)
|
template_args = TemplateArguments(type_)
|
||||||
res_type = GlobalVariableComponent.template(template_args)
|
res_type = GlobalVariableComponent.template(template_args)
|
||||||
initial_value = None
|
initial_value = None
|
||||||
if CONF_INITIAL_VALUE in conf:
|
if CONF_INITIAL_VALUE in config:
|
||||||
initial_value = RawExpression(conf[CONF_INITIAL_VALUE])
|
initial_value = RawExpression(config[CONF_INITIAL_VALUE])
|
||||||
rhs = App.Pmake_global_variable(template_args, initial_value)
|
rhs = App.Pmake_global_variable(template_args, initial_value)
|
||||||
glob = Pvariable(conf[CONF_ID], rhs, type=res_type)
|
glob = Pvariable(config[CONF_ID], rhs, type=res_type)
|
||||||
|
|
||||||
if conf.get(CONF_RESTORE_VALUE, False):
|
if config.get(CONF_RESTORE_VALUE, False):
|
||||||
hash_ = hash(conf[CONF_ID].id) % 2**32
|
hash_ = hash(config[CONF_ID].id) % 2**32
|
||||||
add(glob.set_restore_value(hash_))
|
add(glob.set_restore_value(hash_))
|
||||||
|
|
||||||
setup_component(glob, conf)
|
setup_component(glob, config)
|
||||||
|
|
|
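The globals component likewise moves to MULTI_CONF, so each variable is one list entry, for example (names and values illustrative):

```yaml
# Illustrative variable; initial_value is passed through as a C++ expression.
globals:
  - id: my_counter
    type: int
    restore_value: yes
    initial_value: '0'
```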
@ -1,18 +1,20 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import pins
|
from esphomeyaml import pins
|
||||||
-from esphomeyaml.const import CONF_FREQUENCY, CONF_SCL, CONF_SDA, CONF_SCAN, CONF_ID, \
-    CONF_RECEIVE_TIMEOUT
-from esphomeyaml.helpers import App, add, Pvariable, esphomelib_ns, setup_component, Component
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_FREQUENCY, CONF_ID, CONF_RECEIVE_TIMEOUT, CONF_SCAN, CONF_SCL, \
+    CONF_SDA
+from esphomeyaml.cpp_generator import Pvariable, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Component, esphomelib_ns
|
||||||
|
|
||||||
I2CComponent = esphomelib_ns.class_('I2CComponent', Component)
|
I2CComponent = esphomelib_ns.class_('I2CComponent', Component)
|
||||||
I2CDevice = pins.I2CDevice
|
I2CDevice = pins.I2CDevice
|
||||||
|
|
||||||
CONFIG_SCHEMA = vol.Schema({
|
CONFIG_SCHEMA = vol.Schema({
|
||||||
cv.GenerateID(): cv.declare_variable_id(I2CComponent),
|
cv.GenerateID(): cv.declare_variable_id(I2CComponent),
|
||||||
vol.Required(CONF_SDA, default='SDA'): pins.input_output_pin,
|
vol.Optional(CONF_SDA, default='SDA'): pins.input_pullup_pin,
|
||||||
vol.Required(CONF_SCL, default='SCL'): pins.input_output_pin,
|
vol.Optional(CONF_SCL, default='SCL'): pins.input_pullup_pin,
|
||||||
vol.Optional(CONF_FREQUENCY): vol.All(cv.frequency, vol.Range(min=0, min_included=False)),
|
vol.Optional(CONF_FREQUENCY): vol.All(cv.frequency, vol.Range(min=0, min_included=False)),
|
||||||
vol.Optional(CONF_SCAN): cv.boolean,
|
vol.Optional(CONF_SCAN): cv.boolean,
|
||||||
|
|
||||||
|
|
|
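With SDA/SCL now optional (defaulting to the board's SDA/SCL aliases) and validated as input-pullup pins, a typical i2c block could be as simple as the sketch below; values shown are the defaults plus an explicit scan:

```yaml
# Defaults made explicit; frequency is optional.
i2c:
  sda: SDA
  scl: SCL
  scan: true
```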
@ -3,17 +3,18 @@ import logging
|
||||||
|
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
import esphomeyaml.config_validation as cv
|
|
||||||
from esphomeyaml import core
|
from esphomeyaml import core
|
||||||
from esphomeyaml.components import display, font
|
from esphomeyaml.components import display, font
|
||||||
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_FILE, CONF_ID, CONF_RESIZE
|
from esphomeyaml.const import CONF_FILE, CONF_ID, CONF_RESIZE
|
||||||
from esphomeyaml.core import HexInt
|
from esphomeyaml.core import CORE, HexInt
|
||||||
-from esphomeyaml.helpers import App, ArrayInitializer, MockObj, Pvariable, RawExpression, add, \
-    relative_path
+from esphomeyaml.cpp_generator import ArrayInitializer, MockObj, Pvariable, RawExpression, add
+from esphomeyaml.cpp_types import App
|
||||||
|
|
||||||
_LOGGER = logging.getLogger(__name__)
|
_LOGGER = logging.getLogger(__name__)
|
||||||
|
|
||||||
DEPENDENCIES = ['display']
|
DEPENDENCIES = ['display']
|
||||||
|
MULTI_CONF = True
|
||||||
|
|
||||||
Image_ = display.display_ns.class_('Image')
|
Image_ = display.display_ns.class_('Image')
|
||||||
|
|
||||||
|
@ -26,21 +27,20 @@ IMAGE_SCHEMA = vol.Schema({
|
||||||
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
|
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
|
||||||
})
|
})
|
||||||
|
|
||||||
CONFIG_SCHEMA = vol.All(font.validate_pillow_installed, cv.ensure_list, [IMAGE_SCHEMA])
|
CONFIG_SCHEMA = vol.All(font.validate_pillow_installed, IMAGE_SCHEMA)
|
||||||
|
|
||||||
|
|
||||||
def to_code(config):
|
def to_code(config):
|
||||||
from PIL import Image
|
from PIL import Image
|
||||||
|
|
||||||
for conf in config:
|
path = CORE.relative_path(config[CONF_FILE])
|
||||||
path = relative_path(conf[CONF_FILE])
|
|
||||||
try:
|
try:
|
||||||
image = Image.open(path)
|
image = Image.open(path)
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
raise core.ESPHomeYAMLError(u"Could not load image file {}: {}".format(path, e))
|
raise core.EsphomeyamlError(u"Could not load image file {}: {}".format(path, e))
|
||||||
|
|
||||||
if CONF_RESIZE in conf:
|
if CONF_RESIZE in config:
|
||||||
image.thumbnail(conf[CONF_RESIZE])
|
image.thumbnail(config[CONF_RESIZE])
|
||||||
|
|
||||||
image = image.convert('1', dither=Image.NONE)
|
image = image.convert('1', dither=Image.NONE)
|
||||||
width, height = image.size
|
width, height = image.size
|
||||||
|
@ -56,10 +56,10 @@ def to_code(config):
|
||||||
pos = x + y * width8
|
pos = x + y * width8
|
||||||
data[pos // 8] |= 0x80 >> (pos % 8)
|
data[pos // 8] |= 0x80 >> (pos % 8)
|
||||||
|
|
||||||
raw_data = MockObj(conf[CONF_RAW_DATA_ID])
|
raw_data = MockObj(config[CONF_RAW_DATA_ID])
|
||||||
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
|
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
|
||||||
raw_data, len(data),
|
raw_data, len(data),
|
||||||
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
|
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
|
||||||
|
|
||||||
rhs = App.make_image(raw_data, width, height)
|
rhs = App.make_image(raw_data, width, height)
|
||||||
Pvariable(conf[CONF_ID], rhs)
|
Pvariable(config[CONF_ID], rhs)
|
||||||
|
|
24
esphomeyaml/components/interval.py
Normal file
|
@ -0,0 +1,24 @@
|
||||||
|
import voluptuous as vol
|
||||||
|
|
||||||
|
from esphomeyaml import automation
|
||||||
|
import esphomeyaml.config_validation as cv
|
||||||
|
from esphomeyaml.const import CONF_ID, CONF_INTERVAL
|
||||||
|
from esphomeyaml.cpp_generator import Pvariable
|
||||||
|
from esphomeyaml.cpp_helpers import setup_component
|
||||||
|
from esphomeyaml.cpp_types import App, NoArg, PollingComponent, Trigger, esphomelib_ns
|
||||||
|
|
||||||
|
IntervalTrigger = esphomelib_ns.class_('IntervalTrigger', Trigger.template(NoArg), PollingComponent)
|
||||||
|
|
||||||
|
CONFIG_SCHEMA = automation.validate_automation(vol.Schema({
|
||||||
|
vol.Required(CONF_ID): cv.declare_variable_id(IntervalTrigger),
|
||||||
|
vol.Required(CONF_INTERVAL): cv.positive_time_period_milliseconds,
|
||||||
|
}).extend(cv.COMPONENT_SCHEMA.schema))
|
||||||
|
|
||||||
|
|
||||||
|
def to_code(config):
|
||||||
|
for conf in config:
|
||||||
|
rhs = App.register_component(IntervalTrigger.new(config[CONF_INTERVAL]))
|
||||||
|
trigger = Pvariable(conf[CONF_ID], rhs)
|
||||||
|
setup_component(trigger, conf)
|
||||||
|
|
||||||
|
automation.build_automation(trigger, NoArg, conf)
|
|
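A sketch of how the new interval component might be used; the switch id in the action is hypothetical:

```yaml
# relay_1 is a hypothetical switch id defined elsewhere.
interval:
  - interval: 10s
    then:
      - switch.toggle: relay_1
```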
@ -1,7 +1,8 @@
|
||||||
import voluptuous as vol
|
import voluptuous as vol
|
||||||
|
|
||||||
from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
|
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
|
||||||
from esphomeyaml.components import mqtt
|
from esphomeyaml.components import mqtt
|
||||||
|
from esphomeyaml.components.mqtt import setup_mqtt_component
|
||||||
import esphomeyaml.config_validation as cv
|
import esphomeyaml.config_validation as cv
|
||||||
from esphomeyaml.const import CONF_ALPHA, CONF_BLUE, CONF_BRIGHTNESS, CONF_COLORS, \
|
from esphomeyaml.const import CONF_ALPHA, CONF_BLUE, CONF_BRIGHTNESS, CONF_COLORS, \
|
||||||
CONF_DEFAULT_TRANSITION_LENGTH, CONF_DURATION, CONF_EFFECTS, CONF_EFFECT_ID, \
|
CONF_DEFAULT_TRANSITION_LENGTH, CONF_DURATION, CONF_EFFECTS, CONF_EFFECT_ID, \
|
||||||
|
@ -9,10 +10,11 @@ from esphomeyaml.const import CONF_ALPHA, CONF_BLUE, CONF_BRIGHTNESS, CONF_COLOR
|
||||||
CONF_NUM_LEDS, CONF_RANDOM, CONF_RED, CONF_SPEED, CONF_STATE, CONF_TRANSITION_LENGTH, \
|
CONF_NUM_LEDS, CONF_RANDOM, CONF_RED, CONF_SPEED, CONF_STATE, CONF_TRANSITION_LENGTH, \
|
||||||
CONF_UPDATE_INTERVAL, CONF_WHITE, CONF_WIDTH, CONF_FLASH_LENGTH, CONF_COLOR_TEMPERATURE, \
|
CONF_UPDATE_INTERVAL, CONF_WHITE, CONF_WIDTH, CONF_FLASH_LENGTH, CONF_COLOR_TEMPERATURE, \
|
||||||
CONF_EFFECT
|
CONF_EFFECT
|
||||||
-from esphomeyaml.helpers import Application, ArrayInitializer, Pvariable, RawExpression, \
-    StructInitializer, add, add_job, esphomelib_ns, process_lambda, setup_mqtt_component, \
-    get_variable, TemplateArguments, templatable, uint32, float_, std_string, Nameable, Component, \
-    Action
+from esphomeyaml.core import CORE
+from esphomeyaml.cpp_generator import process_lambda, Pvariable, add, StructInitializer, \
+    ArrayInitializer, get_variable, templatable
+from esphomeyaml.cpp_types import esphomelib_ns, Application, Component, Nameable, Action, uint32, \
+    float_, std_string, void
|
||||||
|
|
||||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||||
|
|
||||||
|
@ -23,7 +25,8 @@ light_ns = esphomelib_ns.namespace('light')
|
||||||
LightState = light_ns.class_('LightState', Nameable, Component)
|
LightState = light_ns.class_('LightState', Nameable, Component)
|
||||||
MakeLight = Application.struct('MakeLight')
|
MakeLight = Application.struct('MakeLight')
|
||||||
LightOutput = light_ns.class_('LightOutput')
|
LightOutput = light_ns.class_('LightOutput')
|
||||||
-FastLEDLightOutputComponent = light_ns.class_('FastLEDLightOutputComponent', LightOutput)
+AddressableLight = light_ns.class_('AddressableLight')
+AddressableLightRef = AddressableLight.operator('ref')
|
||||||
|
|
||||||
# Actions
|
# Actions
|
||||||
ToggleAction = light_ns.class_('ToggleAction', Action)
|
ToggleAction = light_ns.class_('ToggleAction', Action)
|
||||||
|
@ -32,7 +35,6 @@ TurnOnAction = light_ns.class_('TurnOnAction', Action)
|
||||||
|
|
||||||
LightColorValues = light_ns.class_('LightColorValues')
|
LightColorValues = light_ns.class_('LightColorValues')
|
||||||
|
|
||||||
|
|
||||||
MQTTJSONLightComponent = light_ns.class_('MQTTJSONLightComponent', mqtt.MQTTComponent)
|
MQTTJSONLightComponent = light_ns.class_('MQTTJSONLightComponent', mqtt.MQTTComponent)
|
||||||
|
|
||||||
# Effects
|
# Effects
|
||||||
|
@ -42,28 +44,30 @@ LambdaLightEffect = light_ns.class_('LambdaLightEffect', LightEffect)
|
||||||
StrobeLightEffect = light_ns.class_('StrobeLightEffect', LightEffect)
|
StrobeLightEffect = light_ns.class_('StrobeLightEffect', LightEffect)
|
||||||
StrobeLightEffectColor = light_ns.class_('StrobeLightEffectColor', LightEffect)
|
StrobeLightEffectColor = light_ns.class_('StrobeLightEffectColor', LightEffect)
|
||||||
FlickerLightEffect = light_ns.class_('FlickerLightEffect', LightEffect)
|
FlickerLightEffect = light_ns.class_('FlickerLightEffect', LightEffect)
|
||||||
-BaseFastLEDLightEffect = light_ns.class_('BaseFastLEDLightEffect', LightEffect)
-FastLEDLambdaLightEffect = light_ns.class_('FastLEDLambdaLightEffect', BaseFastLEDLightEffect)
-FastLEDRainbowLightEffect = light_ns.class_('FastLEDRainbowLightEffect', BaseFastLEDLightEffect)
-FastLEDColorWipeEffect = light_ns.class_('FastLEDColorWipeEffect', BaseFastLEDLightEffect)
-FastLEDColorWipeEffectColor = light_ns.class_('FastLEDColorWipeEffectColor', BaseFastLEDLightEffect)
-FastLEDScanEffect = light_ns.class_('FastLEDScanEffect', BaseFastLEDLightEffect)
-FastLEDScanEffectColor = light_ns.class_('FastLEDScanEffectColor', BaseFastLEDLightEffect)
-FastLEDTwinkleEffect = light_ns.class_('FastLEDTwinkleEffect', BaseFastLEDLightEffect)
-FastLEDRandomTwinkleEffect = light_ns.class_('FastLEDRandomTwinkleEffect', BaseFastLEDLightEffect)
-FastLEDFireworksEffect = light_ns.class_('FastLEDFireworksEffect', BaseFastLEDLightEffect)
-FastLEDFlickerEffect = light_ns.class_('FastLEDFlickerEffect', BaseFastLEDLightEffect)
+AddressableLightEffect = light_ns.class_('AddressableLightEffect', LightEffect)
+AddressableLambdaLightEffect = light_ns.class_('AddressableLambdaLightEffect',
+                                               AddressableLightEffect)
+AddressableRainbowLightEffect = light_ns.class_('AddressableRainbowLightEffect',
+                                                AddressableLightEffect)
+AddressableColorWipeEffect = light_ns.class_('AddressableColorWipeEffect', AddressableLightEffect)
+AddressableColorWipeEffectColor = light_ns.struct('AddressableColorWipeEffectColor')
+AddressableScanEffect = light_ns.class_('AddressableScanEffect', AddressableLightEffect)
+AddressableTwinkleEffect = light_ns.class_('AddressableTwinkleEffect', AddressableLightEffect)
+AddressableRandomTwinkleEffect = light_ns.class_('AddressableRandomTwinkleEffect',
+                                                 AddressableLightEffect)
+AddressableFireworksEffect = light_ns.class_('AddressableFireworksEffect', AddressableLightEffect)
+AddressableFlickerEffect = light_ns.class_('AddressableFlickerEffect', AddressableLightEffect)
|
||||||
|
|
||||||
CONF_STROBE = 'strobe'
|
CONF_STROBE = 'strobe'
|
||||||
CONF_FLICKER = 'flicker'
|
CONF_FLICKER = 'flicker'
|
||||||
CONF_FASTLED_LAMBDA = 'fastled_lambda'
|
CONF_ADDRESSABLE_LAMBDA = 'addressable_lambda'
|
||||||
CONF_FASTLED_RAINBOW = 'fastled_rainbow'
|
CONF_ADDRESSABLE_RAINBOW = 'addressable_rainbow'
|
||||||
CONF_FASTLED_COLOR_WIPE = 'fastled_color_wipe'
|
CONF_ADDRESSABLE_COLOR_WIPE = 'addressable_color_wipe'
|
||||||
CONF_FASTLED_SCAN = 'fastled_scan'
|
CONF_ADDRESSABLE_SCAN = 'addressable_scan'
|
||||||
CONF_FASTLED_TWINKLE = 'fastled_twinkle'
|
CONF_ADDRESSABLE_TWINKLE = 'addressable_twinkle'
|
||||||
CONF_FASTLED_RANDOM_TWINKLE = 'fastled_random_twinkle'
|
CONF_ADDRESSABLE_RANDOM_TWINKLE = 'addressable_random_twinkle'
|
||||||
CONF_FASTLED_FIREWORKS = 'fastled_fireworks'
|
CONF_ADDRESSABLE_FIREWORKS = 'addressable_fireworks'
|
||||||
CONF_FASTLED_FLICKER = 'fastled_flicker'
|
CONF_ADDRESSABLE_FLICKER = 'addressable_flicker'
|
||||||
|
|
||||||
CONF_ADD_LED_INTERVAL = 'add_led_interval'
|
CONF_ADD_LED_INTERVAL = 'add_led_interval'
|
||||||
CONF_REVERSE = 'reverse'
|
CONF_REVERSE = 'reverse'
|
||||||
|
@ -78,10 +82,10 @@ CONF_INTENSITY = 'intensity'
|
||||||
BINARY_EFFECTS = [CONF_LAMBDA, CONF_STROBE]
|
BINARY_EFFECTS = [CONF_LAMBDA, CONF_STROBE]
|
||||||
MONOCHROMATIC_EFFECTS = BINARY_EFFECTS + [CONF_FLICKER]
|
MONOCHROMATIC_EFFECTS = BINARY_EFFECTS + [CONF_FLICKER]
|
||||||
RGB_EFFECTS = MONOCHROMATIC_EFFECTS + [CONF_RANDOM]
|
RGB_EFFECTS = MONOCHROMATIC_EFFECTS + [CONF_RANDOM]
|
||||||
FASTLED_EFFECTS = RGB_EFFECTS + [CONF_FASTLED_LAMBDA, CONF_FASTLED_RAINBOW, CONF_FASTLED_COLOR_WIPE,
|
ADDRESSABLE_EFFECTS = RGB_EFFECTS + [CONF_ADDRESSABLE_LAMBDA, CONF_ADDRESSABLE_RAINBOW,
|
||||||
CONF_FASTLED_SCAN, CONF_FASTLED_TWINKLE,
|
CONF_ADDRESSABLE_COLOR_WIPE, CONF_ADDRESSABLE_SCAN,
|
||||||
CONF_FASTLED_RANDOM_TWINKLE, CONF_FASTLED_FIREWORKS,
|
CONF_ADDRESSABLE_TWINKLE, CONF_ADDRESSABLE_RANDOM_TWINKLE,
|
||||||
CONF_FASTLED_FLICKER]
|
CONF_ADDRESSABLE_FIREWORKS, CONF_ADDRESSABLE_FLICKER]
|
||||||
|
|
||||||
 EFFECTS_SCHEMA = vol.Schema({
     vol.Optional(CONF_LAMBDA): vol.Schema({
@@ -98,7 +102,7 @@ EFFECTS_SCHEMA = vol.Schema({
     vol.Optional(CONF_STROBE): vol.Schema({
         cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(StrobeLightEffect),
         vol.Optional(CONF_NAME, default="Strobe"): cv.string,
-        vol.Optional(CONF_COLORS): vol.All(cv.ensure_list, [vol.All(vol.Schema({
+        vol.Optional(CONF_COLORS): vol.All(cv.ensure_list(vol.Schema({
             vol.Optional(CONF_STATE, default=True): cv.boolean,
             vol.Optional(CONF_BRIGHTNESS, default=1.0): cv.percentage,
             vol.Optional(CONF_RED, default=1.0): cv.percentage,
@@ -107,7 +111,7 @@ EFFECTS_SCHEMA = vol.Schema({
             vol.Optional(CONF_WHITE, default=1.0): cv.percentage,
             vol.Required(CONF_DURATION): cv.positive_time_period_milliseconds,
         }), cv.has_at_least_one_key(CONF_STATE, CONF_BRIGHTNESS, CONF_RED, CONF_GREEN, CONF_BLUE,
-                                    CONF_WHITE))], vol.Length(min=2)),
+                                    CONF_WHITE)), vol.Length(min=2)),
     }),
     vol.Optional(CONF_FLICKER): vol.Schema({
         cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FlickerLightEffect),
@@ -115,58 +119,59 @@ EFFECTS_SCHEMA = vol.Schema({
         vol.Optional(CONF_ALPHA): cv.percentage,
         vol.Optional(CONF_INTENSITY): cv.percentage,
     }),
-    vol.Optional(CONF_FASTLED_LAMBDA): vol.Schema({
+    vol.Optional(CONF_ADDRESSABLE_LAMBDA): vol.Schema({
         vol.Required(CONF_NAME): cv.string,
         vol.Required(CONF_LAMBDA): cv.lambda_,
         vol.Optional(CONF_UPDATE_INTERVAL, default='0ms'): cv.positive_time_period_milliseconds,
     }),
-    vol.Optional(CONF_FASTLED_RAINBOW): vol.Schema({
-        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDRainbowLightEffect),
+    vol.Optional(CONF_ADDRESSABLE_RAINBOW): vol.Schema({
+        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableRainbowLightEffect),
         vol.Optional(CONF_NAME, default="Rainbow"): cv.string,
         vol.Optional(CONF_SPEED): cv.uint32_t,
         vol.Optional(CONF_WIDTH): cv.uint32_t,
     }),
-    vol.Optional(CONF_FASTLED_COLOR_WIPE): vol.Schema({
-        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDColorWipeEffect),
+    vol.Optional(CONF_ADDRESSABLE_COLOR_WIPE): vol.Schema({
+        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableColorWipeEffect),
         vol.Optional(CONF_NAME, default="Color Wipe"): cv.string,
-        vol.Optional(CONF_COLORS): vol.All(cv.ensure_list, [vol.Schema({
+        vol.Optional(CONF_COLORS): cv.ensure_list({
             vol.Optional(CONF_RED, default=1.0): cv.percentage,
             vol.Optional(CONF_GREEN, default=1.0): cv.percentage,
             vol.Optional(CONF_BLUE, default=1.0): cv.percentage,
+            vol.Optional(CONF_WHITE, default=1.0): cv.percentage,
             vol.Optional(CONF_RANDOM, default=False): cv.boolean,
             vol.Required(CONF_NUM_LEDS): vol.All(cv.uint32_t, vol.Range(min=1)),
-        })]),
+        }),
         vol.Optional(CONF_ADD_LED_INTERVAL): cv.positive_time_period_milliseconds,
         vol.Optional(CONF_REVERSE): cv.boolean,
     }),
-    vol.Optional(CONF_FASTLED_SCAN): vol.Schema({
-        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDScanEffect),
+    vol.Optional(CONF_ADDRESSABLE_SCAN): vol.Schema({
+        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableScanEffect),
         vol.Optional(CONF_NAME, default="Scan"): cv.string,
         vol.Optional(CONF_MOVE_INTERVAL): cv.positive_time_period_milliseconds,
     }),
-    vol.Optional(CONF_FASTLED_TWINKLE): vol.Schema({
-        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDTwinkleEffect),
+    vol.Optional(CONF_ADDRESSABLE_TWINKLE): vol.Schema({
+        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableTwinkleEffect),
         vol.Optional(CONF_NAME, default="Twinkle"): cv.string,
         vol.Optional(CONF_TWINKLE_PROBABILITY): cv.percentage,
         vol.Optional(CONF_PROGRESS_INTERVAL): cv.positive_time_period_milliseconds,
     }),
-    vol.Optional(CONF_FASTLED_RANDOM_TWINKLE): vol.Schema({
-        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDRandomTwinkleEffect),
+    vol.Optional(CONF_ADDRESSABLE_RANDOM_TWINKLE): vol.Schema({
+        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableRandomTwinkleEffect),
         vol.Optional(CONF_NAME, default="Random Twinkle"): cv.string,
         vol.Optional(CONF_TWINKLE_PROBABILITY): cv.percentage,
         vol.Optional(CONF_PROGRESS_INTERVAL): cv.positive_time_period_milliseconds,
     }),
-    vol.Optional(CONF_FASTLED_FIREWORKS): vol.Schema({
-        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDFireworksEffect),
+    vol.Optional(CONF_ADDRESSABLE_FIREWORKS): vol.Schema({
+        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableFireworksEffect),
         vol.Optional(CONF_NAME, default="Fireworks"): cv.string,
         vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
         vol.Optional(CONF_SPARK_PROBABILITY): cv.percentage,
         vol.Optional(CONF_USE_RANDOM_COLOR): cv.boolean,
         vol.Optional(CONF_FADE_OUT_RATE): cv.uint8_t,
     }),
-    vol.Optional(CONF_FASTLED_FLICKER): vol.Schema({
-        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDFlickerEffect),
-        vol.Optional(CONF_NAME, default="FastLED Flicker"): cv.string,
+    vol.Optional(CONF_ADDRESSABLE_FLICKER): vol.Schema({
+        cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(AddressableFlickerEffect),
        vol.Optional(CONF_NAME, default="Addressable Flicker"): cv.string,
         vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
         vol.Optional(CONF_INTENSITY): cv.percentage,
     }),
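Throughout this schema the old two-step pattern `vol.All(cv.ensure_list, [validator])` is replaced by calling `cv.ensure_list(validator)` directly, i.e. ensure_list now acts as a validator factory that both coerces a single value into a list and validates every item. A minimal, self-contained sketch of that pattern (this is an illustration only, not esphomeyaml's actual `cv.ensure_list` implementation):

```python
import voluptuous as vol


def ensure_list(*validators):
    """Hypothetical stand-in for cv.ensure_list: coerce to a list, then
    validate each item with the given validators."""
    inner = vol.All(*validators) if validators else None

    def validator(value):
        if value is None:
            return []
        if not isinstance(value, list):
            value = [value]
        if inner is None:
            return value
        return [inner(item) for item in value]

    return validator


color_schema = vol.Schema({
    vol.Optional('red', default=1.0): vol.Coerce(float),
    vol.Optional('green', default=1.0): vol.Coerce(float),
})

# A single mapping and a list of mappings both validate:
print(ensure_list(color_schema)({'red': 0.5}))
print(ensure_list(color_schema)([{'red': 0.5}, {'green': 0.2}]))
```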
@@ -175,28 +180,64 @@ EFFECTS_SCHEMA = vol.Schema({

 def validate_effects(allowed_effects):
     def validator(value):
-        value = cv.ensure_list(value)
+        is_list = isinstance(value, list)
+        if not is_list:
+            value = [value]
         names = set()
         ret = []
+        errors = []
         for i, effect in enumerate(value):
+            path = [i] if is_list else []
             if not isinstance(effect, dict):
-                raise vol.Invalid("Each effect must be a dictionary, not {}".format(type(value)))
+                errors.append(
+                    vol.Invalid("Each effect must be a dictionary, not {}".format(type(value)),
+                                path)
+                )
+                continue
             if len(effect) > 1:
-                raise vol.Invalid("Each entry in the 'effects:' option must be a single effect.")
+                errors.append(
+                    vol.Invalid("Each entry in the 'effects:' option must be a single effect.",
+                                path)
+                )
+                continue
             if not effect:
-                raise vol.Invalid("Found no effect for the {}th entry in 'effects:'!".format(i))
+                errors.append(
+                    vol.Invalid("Found no effect for the {}th entry in 'effects:'!".format(i),
+                                path)
+                )
+                continue
             key = next(iter(effect.keys()))
+            if key.startswith('fastled'):
+                errors.append(
+                    vol.Invalid("FastLED effects have been renamed to addressable effects. "
+                                "Please use '{}'".format(key.replace('fastled', 'addressable')),
+                                path)
+                )
+                continue
             if key not in allowed_effects:
-                raise vol.Invalid("The effect '{}' does not exist or is not allowed for this "
-                                  "light type".format(key))
+                errors.append(
+                    vol.Invalid("The effect '{}' does not exist or is not allowed for this "
+                                "light type".format(key), path)
+                )
+                continue
             effect[key] = effect[key] or {}
-            conf = EFFECTS_SCHEMA(effect)
+            try:
+                conf = EFFECTS_SCHEMA(effect)
+            except vol.Invalid as err:
+                err.prepend(path)
+                errors.append(err)
+                continue
             name = conf[key][CONF_NAME]
             if name in names:
-                raise vol.Invalid(u"Found the effect name '{}' twice. All effects must have "
-                                  u"unique names".format(name))
+                errors.append(
+                    vol.Invalid(u"Found the effect name '{}' twice. All effects must have "
+                                u"unique names".format(name), [i])
+                )
+                continue
             names.add(name)
             ret.append(conf)
+        if errors:
+            raise vol.MultipleInvalid(errors)
         return ret

     return validator
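The rewritten validator no longer aborts on the first bad entry: it records every `vol.Invalid` (carrying the list index as its error path) and raises them together as a `vol.MultipleInvalid`, and it intercepts the old `fastled_*` keys to point users at the new `addressable_*` names. A small self-contained sketch of the accumulate-then-raise pattern, using plain voluptuous rather than esphomeyaml's cv helpers:

```python
import voluptuous as vol

ITEM_SCHEMA = vol.Schema({vol.Required('name'): str})


def validate_items(value):
    # Accept a single mapping or a list of mappings, like validate_effects does.
    is_list = isinstance(value, list)
    if not is_list:
        value = [value]
    errors, ret = [], []
    for i, item in enumerate(value):
        path = [i] if is_list else []
        if not isinstance(item, dict):
            errors.append(vol.Invalid("Each entry must be a mapping", path))
            continue
        try:
            ret.append(ITEM_SCHEMA(item))
        except vol.Invalid as err:
            err.prepend(path)   # keep the list index in the error path
            errors.append(err)
    if errors:
        raise vol.MultipleInvalid(errors)
    return ret


try:
    validate_items([{'name': 'ok'}, {}, 'not-a-dict'])
except vol.MultipleInvalid as exc:
    # Both problems are reported at once, each with its index in the path.
    print(exc.errors)
```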
@@ -214,7 +255,7 @@ def build_effect(full_config):
     key, config = next(iter(full_config.items()))
     if key == CONF_LAMBDA:
         lambda_ = None
-        for lambda_ in process_lambda(config[CONF_LAMBDA], []):
+        for lambda_ in process_lambda(config[CONF_LAMBDA], [], return_type=void):
             yield None
         yield LambdaLightEffect.new(config[CONF_NAME], lambda_, config[CONF_UPDATE_INTERVAL])
     elif key == CONF_RANDOM:
@@ -248,22 +289,22 @@ def build_effect(full_config):
         if CONF_INTENSITY in config:
             add(effect.set_intensity(config[CONF_INTENSITY]))
         yield effect
-    elif key == CONF_FASTLED_LAMBDA:
-        lambda_ = None
-        args = [(RawExpression('FastLEDLightOutputComponent &'), 'it')]
-        for lambda_ in process_lambda(config[CONF_LAMBDA], args):
+    elif key == CONF_ADDRESSABLE_LAMBDA:
+        args = [(AddressableLightRef, 'it')]
+        for lambda_ in process_lambda(config[CONF_LAMBDA], args, return_type=void):
             yield None
-        yield FastLEDLambdaLightEffect.new(config[CONF_NAME], lambda_, config[CONF_UPDATE_INTERVAL])
-    elif key == CONF_FASTLED_RAINBOW:
-        rhs = FastLEDRainbowLightEffect.new(config[CONF_NAME])
+        yield AddressableLambdaLightEffect.new(config[CONF_NAME], lambda_,
+                                               config[CONF_UPDATE_INTERVAL])
+    elif key == CONF_ADDRESSABLE_RAINBOW:
+        rhs = AddressableRainbowLightEffect.new(config[CONF_NAME])
         effect = Pvariable(config[CONF_EFFECT_ID], rhs)
         if CONF_SPEED in config:
             add(effect.set_speed(config[CONF_SPEED]))
         if CONF_WIDTH in config:
             add(effect.set_width(config[CONF_WIDTH]))
         yield effect
-    elif key == CONF_FASTLED_COLOR_WIPE:
-        rhs = FastLEDColorWipeEffect.new(config[CONF_NAME])
+    elif key == CONF_ADDRESSABLE_COLOR_WIPE:
+        rhs = AddressableColorWipeEffect.new(config[CONF_NAME])
         effect = Pvariable(config[CONF_EFFECT_ID], rhs)
         if CONF_ADD_LED_INTERVAL in config:
             add(effect.set_add_led_interval(config[CONF_ADD_LED_INTERVAL]))
@@ -272,40 +313,41 @@ def build_effect(full_config):
         colors = []
         for color in config.get(CONF_COLORS, []):
             colors.append(StructInitializer(
-                FastLEDColorWipeEffectColor,
-                ('r', color[CONF_RED]),
-                ('g', color[CONF_GREEN]),
-                ('b', color[CONF_BLUE]),
+                AddressableColorWipeEffectColor,
+                ('r', int(round(color[CONF_RED] * 255))),
+                ('g', int(round(color[CONF_GREEN] * 255))),
+                ('b', int(round(color[CONF_BLUE] * 255))),
+                ('w', int(round(color[CONF_WHITE] * 255))),
                 ('random', color[CONF_RANDOM]),
                 ('num_leds', color[CONF_NUM_LEDS]),
             ))
         if colors:
             add(effect.set_colors(ArrayInitializer(*colors)))
         yield effect
-    elif key == CONF_FASTLED_SCAN:
-        rhs = FastLEDScanEffect.new(config[CONF_NAME])
+    elif key == CONF_ADDRESSABLE_SCAN:
+        rhs = AddressableScanEffect.new(config[CONF_NAME])
         effect = Pvariable(config[CONF_EFFECT_ID], rhs)
         if CONF_MOVE_INTERVAL in config:
             add(effect.set_move_interval(config[CONF_MOVE_INTERVAL]))
         yield effect
-    elif key == CONF_FASTLED_TWINKLE:
-        rhs = FastLEDTwinkleEffect.new(config[CONF_NAME])
+    elif key == CONF_ADDRESSABLE_TWINKLE:
+        rhs = AddressableTwinkleEffect.new(config[CONF_NAME])
         effect = Pvariable(config[CONF_EFFECT_ID], rhs)
         if CONF_TWINKLE_PROBABILITY in config:
             add(effect.set_twinkle_probability(config[CONF_TWINKLE_PROBABILITY]))
         if CONF_PROGRESS_INTERVAL in config:
             add(effect.set_progress_interval(config[CONF_PROGRESS_INTERVAL]))
         yield effect
-    elif key == CONF_FASTLED_RANDOM_TWINKLE:
-        rhs = FastLEDRandomTwinkleEffect.new(config[CONF_NAME])
+    elif key == CONF_ADDRESSABLE_RANDOM_TWINKLE:
+        rhs = AddressableRandomTwinkleEffect.new(config[CONF_NAME])
         effect = Pvariable(config[CONF_EFFECT_ID], rhs)
         if CONF_TWINKLE_PROBABILITY in config:
             add(effect.set_twinkle_probability(config[CONF_TWINKLE_PROBABILITY]))
         if CONF_PROGRESS_INTERVAL in config:
             add(effect.set_progress_interval(config[CONF_PROGRESS_INTERVAL]))
         yield effect
-    elif key == CONF_FASTLED_FIREWORKS:
-        rhs = FastLEDFireworksEffect.new(config[CONF_NAME])
+    elif key == CONF_ADDRESSABLE_FIREWORKS:
+        rhs = AddressableFireworksEffect.new(config[CONF_NAME])
         effect = Pvariable(config[CONF_EFFECT_ID], rhs)
         if CONF_UPDATE_INTERVAL in config:
             add(effect.set_update_interval(config[CONF_UPDATE_INTERVAL]))
@@ -316,8 +358,8 @@ def build_effect(full_config):
         if CONF_FADE_OUT_RATE in config:
             add(effect.set_spark_probability(config[CONF_FADE_OUT_RATE]))
         yield effect
-    elif key == CONF_FASTLED_FLICKER:
-        rhs = FastLEDFlickerEffect.new(config[CONF_NAME])
+    elif key == CONF_ADDRESSABLE_FLICKER:
+        rhs = AddressableFlickerEffect.new(config[CONF_NAME])
         effect = Pvariable(config[CONF_EFFECT_ID], rhs)
         if CONF_UPDATE_INTERVAL in config:
             add(effect.set_update_interval(config[CONF_UPDATE_INTERVAL]))
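The color-wipe colors are now handed to the struct initializer as 0–255 bytes (including the new `w` channel) instead of raw 0.0–1.0 percentages; the conversion used above is simply `int(round(x * 255))`. A quick check of that scaling:

```python
def channel_to_byte(percentage):
    """0.0-1.0 channel value -> 0-255 byte, as used for the wipe colors."""
    return int(round(percentage * 255))


print([channel_to_byte(v) for v in (0.0, 0.2, 1.0)])  # [0, 51, 255]
```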
@@ -349,12 +391,11 @@ def setup_light_core_(light_var, mqtt_var, config):

 def setup_light(light_obj, mqtt_obj, config):
     light_var = Pvariable(config[CONF_ID], light_obj, has_side_effects=False)
     mqtt_var = Pvariable(config[CONF_MQTT_ID], mqtt_obj, has_side_effects=False)
-    add_job(setup_light_core_, light_var, mqtt_var, config)
+    CORE.add_job(setup_light_core_, light_var, mqtt_var, config)


 BUILD_FLAGS = '-DUSE_LIGHT'


 CONF_LIGHT_TOGGLE = 'light.toggle'
 LIGHT_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
     vol.Required(CONF_ID): cv.use_variable_id(LightState),
@@ -363,8 +404,7 @@ LIGHT_TOGGLE_ACTION_SCHEMA = maybe_simple_id({


 @ACTION_REGISTRY.register(CONF_LIGHT_TOGGLE, LIGHT_TOGGLE_ACTION_SCHEMA)
-def light_toggle_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def light_toggle_to_code(config, action_id, arg_type, template_arg):
     for var in get_variable(config[CONF_ID]):
         yield None
     rhs = var.make_toggle_action(template_arg)
@@ -385,8 +425,7 @@ LIGHT_TURN_OFF_ACTION_SCHEMA = maybe_simple_id({


 @ACTION_REGISTRY.register(CONF_LIGHT_TURN_OFF, LIGHT_TURN_OFF_ACTION_SCHEMA)
-def light_turn_off_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def light_turn_off_to_code(config, action_id, arg_type, template_arg):
     for var in get_variable(config[CONF_ID]):
         yield None
     rhs = var.make_turn_off_action(template_arg)
@@ -417,8 +456,7 @@ LIGHT_TURN_ON_ACTION_SCHEMA = maybe_simple_id({


 @ACTION_REGISTRY.register(CONF_LIGHT_TURN_ON, LIGHT_TURN_ON_ACTION_SCHEMA)
-def light_turn_on_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def light_turn_on_to_code(config, action_id, arg_type, template_arg):
     for var in get_variable(config[CONF_ID]):
         yield None
     rhs = var.make_turn_on_action(template_arg)
@@ -465,10 +503,10 @@ def light_turn_on_to_code(config, action_id, arg_type):


 def core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=True,
                         white_value=True):
-    ret = mqtt.build_hass_config(data, 'light', config, include_state=True, include_command=True,
-                                 platform='mqtt_json')
+    ret = mqtt.build_hass_config(data, 'light', config, include_state=True, include_command=True)
     if ret is None:
         return None
+    ret['schema'] = 'json'
     if brightness:
         ret['brightness'] = True
     if rgb:
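`core_to_hass_config` no longer asks `mqtt.build_hass_config` for a `platform: mqtt_json` entry; it now tags the generated discovery payload with `schema: json`, which is the key Home Assistant's MQTT light integration uses to select the JSON schema. Roughly, the resulting discovery dictionary looks like this — only the `schema`, `brightness` and `rgb` keys come from the diff, the other field names are illustrative:

```python
# Sketch of a discovery payload for an addressable light; topic names are assumptions.
discovery_payload = {
    'name': 'Living Room Strip',
    'state_topic': 'livingroom/light/strip/state',
    'command_topic': 'livingroom/light/strip/command',
    'schema': 'json',        # replaces the removed platform='mqtt_json' argument
    'brightness': True,
    'rgb': True,
}
```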
@@ -3,7 +3,9 @@ import voluptuous as vol
 from esphomeyaml.components import light, output
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_EFFECTS, CONF_MAKE_ID, CONF_NAME, CONF_OUTPUT
-from esphomeyaml.helpers import App, get_variable, setup_component, variable
+from esphomeyaml.cpp_generator import get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App

 PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),

@@ -7,7 +7,9 @@ from esphomeyaml.components.light.rgbww import validate_cold_white_colder, \
 from esphomeyaml.const import CONF_COLD_WHITE, CONF_COLD_WHITE_COLOR_TEMPERATURE, \
     CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, \
     CONF_NAME, CONF_WARM_WHITE, CONF_WARM_WHITE_COLOR_TEMPERATURE
-from esphomeyaml.helpers import App, get_variable, variable, setup_component
+from esphomeyaml.cpp_generator import get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App

 PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
@@ -1,14 +1,15 @@
 import voluptuous as vol

-import esphomeyaml.config_validation as cv
 from esphomeyaml import pins
 from esphomeyaml.components import light
 from esphomeyaml.components.power_supply import PowerSupplyComponent
-from esphomeyaml.const import CONF_CHIPSET, CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, \
-    CONF_MAKE_ID, CONF_MAX_REFRESH_RATE, CONF_NAME, CONF_NUM_LEDS, CONF_PIN, CONF_POWER_SUPPLY, \
-    CONF_RGB_ORDER, CONF_EFFECTS, CONF_COLOR_CORRECT
-from esphomeyaml.helpers import App, Application, RawExpression, TemplateArguments, add, \
-    get_variable, variable, setup_component
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_CHIPSET, CONF_COLOR_CORRECT, CONF_DEFAULT_TRANSITION_LENGTH, \
+    CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, CONF_MAX_REFRESH_RATE, CONF_NAME, \
+    CONF_NUM_LEDS, CONF_PIN, CONF_POWER_SUPPLY, CONF_RGB_ORDER
+from esphomeyaml.cpp_generator import RawExpression, TemplateArguments, add, get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Application

 TYPES = [
     'NEOPIXEL',
@@ -58,18 +59,18 @@ MakeFastLEDLight = Application.struct('MakeFastLEDLight')
 PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeFastLEDLight),

-    vol.Required(CONF_CHIPSET): vol.All(vol.Upper, cv.one_of(*TYPES)),
+    vol.Required(CONF_CHIPSET): cv.one_of(*TYPES, upper=True),
     vol.Required(CONF_PIN): pins.output_pin,

     vol.Required(CONF_NUM_LEDS): cv.positive_not_null_int,
     vol.Optional(CONF_MAX_REFRESH_RATE): cv.positive_time_period_microseconds,
-    vol.Optional(CONF_RGB_ORDER): vol.All(vol.Upper, cv.one_of(*RGB_ORDERS)),
+    vol.Optional(CONF_RGB_ORDER): cv.one_of(*RGB_ORDERS, upper=True),

     vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
     vol.Optional(CONF_COLOR_CORRECT): vol.All([cv.percentage], vol.Length(min=3, max=3)),
     vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
     vol.Optional(CONF_POWER_SUPPLY): cv.use_variable_id(PowerSupplyComponent),
-    vol.Optional(CONF_EFFECTS): light.validate_effects(light.FASTLED_EFFECTS),
+    vol.Optional(CONF_EFFECTS): light.validate_effects(light.ADDRESSABLE_EFFECTS),
 }).extend(cv.COMPONENT_SCHEMA.schema), validate)


@@ -103,6 +104,8 @@ def to_code(config):

 BUILD_FLAGS = '-DUSE_FAST_LED_LIGHT'

+LIB_DEPS = 'FastLED@3.2.0'


 def to_hass_config(data, config):
     return light.core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=False,
@@ -1,14 +1,15 @@
 import voluptuous as vol

-import esphomeyaml.config_validation as cv
 from esphomeyaml import pins
 from esphomeyaml.components import light
 from esphomeyaml.components.power_supply import PowerSupplyComponent
-from esphomeyaml.const import CONF_CHIPSET, CONF_CLOCK_PIN, CONF_DATA_PIN, \
-    CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, CONF_MAKE_ID, CONF_MAX_REFRESH_RATE, \
-    CONF_NAME, CONF_NUM_LEDS, CONF_POWER_SUPPLY, CONF_RGB_ORDER, CONF_EFFECTS, CONF_COLOR_CORRECT
-from esphomeyaml.helpers import App, Application, RawExpression, TemplateArguments, add, \
-    get_variable, variable, setup_component
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_CHIPSET, CONF_CLOCK_PIN, CONF_COLOR_CORRECT, CONF_DATA_PIN, \
+    CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, \
+    CONF_MAX_REFRESH_RATE, CONF_NAME, CONF_NUM_LEDS, CONF_POWER_SUPPLY, CONF_RGB_ORDER
+from esphomeyaml.cpp_generator import RawExpression, TemplateArguments, add, get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Application

 CHIPSETS = [
     'LPD8806',
@@ -35,19 +36,19 @@ MakeFastLEDLight = Application.struct('MakeFastLEDLight')
 PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeFastLEDLight),

-    vol.Required(CONF_CHIPSET): vol.All(vol.Upper, cv.one_of(*CHIPSETS)),
+    vol.Required(CONF_CHIPSET): cv.one_of(*CHIPSETS, upper=True),
     vol.Required(CONF_DATA_PIN): pins.output_pin,
     vol.Required(CONF_CLOCK_PIN): pins.output_pin,

     vol.Required(CONF_NUM_LEDS): cv.positive_not_null_int,
-    vol.Optional(CONF_RGB_ORDER): vol.All(vol.Upper, cv.one_of(*RGB_ORDERS)),
+    vol.Optional(CONF_RGB_ORDER): cv.one_of(*RGB_ORDERS, upper=True),
     vol.Optional(CONF_MAX_REFRESH_RATE): cv.positive_time_period_microseconds,

     vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
     vol.Optional(CONF_COLOR_CORRECT): vol.All([cv.percentage], vol.Length(min=3, max=3)),
     vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
     vol.Optional(CONF_POWER_SUPPLY): cv.use_variable_id(PowerSupplyComponent),
-    vol.Optional(CONF_EFFECTS): light.validate_effects(light.FASTLED_EFFECTS),
+    vol.Optional(CONF_EFFECTS): light.validate_effects(light.ADDRESSABLE_EFFECTS),
 }).extend(cv.COMPONENT_SCHEMA.schema))


@@ -83,6 +84,8 @@ def to_code(config):

 BUILD_FLAGS = '-DUSE_FAST_LED_LIGHT'

+LIB_DEPS = 'FastLED@3.2.0'


 def to_hass_config(data, config):
     return light.core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=False,
@@ -4,7 +4,9 @@ from esphomeyaml.components import light, output
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, \
     CONF_MAKE_ID, CONF_NAME, CONF_OUTPUT
-from esphomeyaml.helpers import App, get_variable, setup_component, variable
+from esphomeyaml.cpp_generator import get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App

 PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
esphomeyaml/components/light/neopixelbus.py (new file, 170 lines)
@@ -0,0 +1,170 @@
+import voluptuous as vol
+
+from esphomeyaml import pins
+from esphomeyaml.components import light
+from esphomeyaml.components.light import AddressableLight
+from esphomeyaml.components.power_supply import PowerSupplyComponent
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_CLOCK_PIN, CONF_COLOR_CORRECT, CONF_DATA_PIN, \
+    CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, CONF_METHOD, \
+    CONF_NAME, CONF_NUM_LEDS, CONF_PIN, CONF_POWER_SUPPLY, CONF_TYPE, CONF_VARIANT
+from esphomeyaml.core import CORE
+from esphomeyaml.cpp_generator import TemplateArguments, add, get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Application, Component, global_ns
+
+NeoPixelBusLightOutputBase = light.light_ns.class_('NeoPixelBusLightOutputBase', Component,
+                                                   AddressableLight)
+ESPNeoPixelOrder = light.light_ns.namespace('ESPNeoPixelOrder')
+
+
+def validate_type(value):
+    value = cv.string(value).upper()
+    if 'R' not in value:
+        raise vol.Invalid("Must have R in type")
+    if 'G' not in value:
+        raise vol.Invalid("Must have G in type")
+    if 'B' not in value:
+        raise vol.Invalid("Must have B in type")
+    rest = set(value) - set('RGBW')
+    if rest:
+        raise vol.Invalid("Type has invalid color: {}".format(', '.join(rest)))
+    if len(set(value)) != len(value):
+        raise vol.Invalid("Type has duplicate color!")
+    return value
+
+
+def validate_variant(value):
+    value = cv.string(value).upper()
+    if value == 'WS2813':
+        value = 'WS2812X'
+    if value == 'WS2812':
+        value = '800KBPS'
+    if value == 'LC8812':
+        value = 'SK6812'
+    return cv.one_of(*VARIANTS)(value)
+
+
+def validate_method(value):
+    if value is None:
+        if CORE.is_esp32:
+            return 'ESP32_I2S_1'
+        if CORE.is_esp8266:
+            return 'ESP8266_DMA'
+        raise NotImplementedError
+
+    if CORE.is_esp32:
+        return cv.one_of(*ESP32_METHODS, upper=True, space='_')(value)
+    if CORE.is_esp8266:
+        return cv.one_of(*ESP8266_METHODS, upper=True, space='_')(value)
+    raise NotImplementedError
+
+
+VARIANTS = {
+    'WS2812X': 'Ws2812x',
+    'SK6812': 'Sk6812',
+    '800KBPS': '800Kbps',
+    '400KBPS': '400Kbps',
+}
+
+ESP8266_METHODS = {
+    'ESP8266_DMA': 'NeoEsp8266Dma{}Method',
+    'ESP8266_UART0': 'NeoEsp8266Uart0{}Method',
+    'ESP8266_UART1': 'NeoEsp8266Uart1{}Method',
+    'ESP8266_ASYNC_UART0': 'NeoEsp8266AsyncUart0{}Method',
+    'ESP8266_ASYNC_UART1': 'NeoEsp8266AsyncUart1{}Method',
+    'BIT_BANG': 'NeoEsp8266BitBang{}Method',
+}
+ESP32_METHODS = {
+    'ESP32_I2S_0': 'NeoEsp32I2s0{}Method',
+    'ESP32_I2S_1': 'NeoEsp32I2s1{}Method',
+    'BIT_BANG': 'NeoEsp32BitBang{}Method',
+}
+
+
+def format_method(config):
+    variant = VARIANTS[config[CONF_VARIANT]]
+    method = config[CONF_METHOD]
+    if CORE.is_esp8266:
+        return ESP8266_METHODS[method].format(variant)
+    if CORE.is_esp32:
+        return ESP32_METHODS[method].format(variant)
+    raise NotImplementedError
+
+
+def validate(config):
+    if CONF_PIN in config:
+        if CONF_CLOCK_PIN in config or CONF_DATA_PIN in config:
+            raise vol.Invalid("Cannot specify both 'pin' and 'clock_pin'+'data_pin'")
+        return config
+    if CONF_CLOCK_PIN in config:
+        if CONF_DATA_PIN not in config:
+            raise vol.Invalid("If you give clock_pin, you must also specify data_pin")
+        return config
+    raise vol.Invalid("Must specify at least one of 'pin' or 'clock_pin'+'data_pin'")
+
+
+MakeNeoPixelBusLight = Application.struct('MakeNeoPixelBusLight')
+
+PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
+    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeNeoPixelBusLight),
+
+    vol.Optional(CONF_TYPE, default='GRB'): validate_type,
+    vol.Optional(CONF_VARIANT, default='800KBPS'): validate_variant,
+    vol.Optional(CONF_METHOD, default=None): validate_method,
+    vol.Optional(CONF_PIN): pins.output_pin,
+    vol.Optional(CONF_CLOCK_PIN): pins.output_pin,
+    vol.Optional(CONF_DATA_PIN): pins.output_pin,
+
+    vol.Required(CONF_NUM_LEDS): cv.positive_not_null_int,
+
+    vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
+    vol.Optional(CONF_COLOR_CORRECT): vol.All([cv.percentage], vol.Length(min=3, max=4)),
+    vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
+    vol.Optional(CONF_POWER_SUPPLY): cv.use_variable_id(PowerSupplyComponent),
+    vol.Optional(CONF_EFFECTS): light.validate_effects(light.ADDRESSABLE_EFFECTS),
+}).extend(cv.COMPONENT_SCHEMA.schema), validate)
+
+
+def to_code(config):
+    type_ = config[CONF_TYPE]
+    has_white = 'W' in type_
+    if has_white:
+        func = App.make_neo_pixel_bus_rgbw_light
+        color_feat = global_ns.NeoRgbwFeature
+    else:
+        func = App.make_neo_pixel_bus_rgb_light
+        color_feat = global_ns.NeoRgbFeature
+
+    template = TemplateArguments(getattr(global_ns, format_method(config)), color_feat)
+    rhs = func(template, config[CONF_NAME])
+    make = variable(config[CONF_MAKE_ID], rhs, type=MakeNeoPixelBusLight.template(template))
+    output = make.Poutput
+
+    if CONF_PIN in config:
+        add(output.add_leds(config[CONF_NUM_LEDS], config[CONF_PIN]))
+    else:
+        add(output.add_leds(config[CONF_NUM_LEDS], config[CONF_CLOCK_PIN], config[CONF_DATA_PIN]))
+
+    add(output.set_pixel_order(getattr(ESPNeoPixelOrder, type_)))
+
+    if CONF_POWER_SUPPLY in config:
+        for power_supply in get_variable(config[CONF_POWER_SUPPLY]):
+            yield
+        add(output.set_power_supply(power_supply))
+
+    if CONF_COLOR_CORRECT in config:
+        add(output.set_correction(*config[CONF_COLOR_CORRECT]))
+
+    light.setup_light(make.Pstate, make.Pmqtt, config)
+    setup_component(output, config)
+
+
+BUILD_FLAGS = '-DUSE_NEO_PIXEL_BUS_LIGHT'
+
+LIB_DEPS = 'NeoPixelBus@2.4.1'
+
+
+def to_hass_config(data, config):
+    return light.core_to_hass_config(data, config, brightness=True, rgb=True, color_temp=False,
+                                     white_value='W' in config[CONF_TYPE])
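The new neopixelbus platform picks the NeoPixelBus C++ driver class by combining the selected method with the variant name, so `ESP8266_DMA` plus the `WS2812X` variant becomes `NeoEsp8266DmaWs2812xMethod`. A self-contained sketch of that lookup, using the tables from the file above (the `is_esp8266` flag stands in for the `CORE.is_esp8266`/`CORE.is_esp32` check and is the only assumption here):

```python
VARIANTS = {'WS2812X': 'Ws2812x', 'SK6812': 'Sk6812',
            '800KBPS': '800Kbps', '400KBPS': '400Kbps'}
ESP8266_METHODS = {'ESP8266_DMA': 'NeoEsp8266Dma{}Method',
                   'ESP8266_UART0': 'NeoEsp8266Uart0{}Method',
                   'BIT_BANG': 'NeoEsp8266BitBang{}Method'}
ESP32_METHODS = {'ESP32_I2S_0': 'NeoEsp32I2s0{}Method',
                 'ESP32_I2S_1': 'NeoEsp32I2s1{}Method',
                 'BIT_BANG': 'NeoEsp32BitBang{}Method'}


def format_method(method, variant, is_esp8266=True):
    """Return the NeoPixelBus method class name for a method/variant pair."""
    table = ESP8266_METHODS if is_esp8266 else ESP32_METHODS
    return table[method].format(VARIANTS[variant])


print(format_method('ESP8266_DMA', 'WS2812X'))                    # NeoEsp8266DmaWs2812xMethod
print(format_method('ESP32_I2S_1', 'SK6812', is_esp8266=False))   # NeoEsp32I2s1Sk6812Method
```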
@@ -1,10 +1,12 @@
 import voluptuous as vol

-import esphomeyaml.config_validation as cv
 from esphomeyaml.components import light, output
-from esphomeyaml.const import CONF_BLUE, CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, \
-    CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED, CONF_EFFECTS
-from esphomeyaml.helpers import App, get_variable, variable, setup_component
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_BLUE, CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, \
+    CONF_GAMMA_CORRECT, CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED
+from esphomeyaml.cpp_generator import get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App

 PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),

@@ -1,10 +1,12 @@
 import voluptuous as vol

-import esphomeyaml.config_validation as cv
 from esphomeyaml.components import light, output
-from esphomeyaml.const import CONF_BLUE, CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, \
-    CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED, CONF_WHITE, CONF_EFFECTS
-from esphomeyaml.helpers import App, get_variable, variable, setup_component
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_BLUE, CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, \
+    CONF_GAMMA_CORRECT, CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED, CONF_WHITE
+from esphomeyaml.cpp_generator import get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App

 PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),

@@ -1,11 +1,13 @@
 import voluptuous as vol

-import esphomeyaml.config_validation as cv
 from esphomeyaml.components import light, output
+import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_BLUE, CONF_COLD_WHITE, CONF_COLD_WHITE_COLOR_TEMPERATURE, \
     CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_GREEN, CONF_MAKE_ID, \
     CONF_NAME, CONF_RED, CONF_WARM_WHITE, CONF_WARM_WHITE_COLOR_TEMPERATURE
-from esphomeyaml.helpers import App, get_variable, variable, setup_component
+from esphomeyaml.cpp_generator import get_variable, variable
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App


 def validate_color_temperature(value):
@@ -6,9 +6,11 @@ from esphomeyaml.automation import ACTION_REGISTRY, LambdaAction
 import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_ARGS, CONF_BAUD_RATE, CONF_FORMAT, CONF_ID, CONF_LEVEL, \
     CONF_LOGS, CONF_TAG, CONF_TX_BUFFER_SIZE
-from esphomeyaml.core import ESPHomeYAMLError, Lambda
-from esphomeyaml.helpers import App, Pvariable, RawExpression, TemplateArguments, add, \
-    esphomelib_ns, global_ns, process_lambda, statement, Component
+from esphomeyaml.core import EsphomeyamlError, Lambda, CORE
+from esphomeyaml.cpp_generator import Pvariable, RawExpression, add, process_lambda, statement
+from esphomeyaml.cpp_types import App, Component, esphomelib_ns, global_ns, void
+from esphomeyaml.py_compat import text_type

 LOG_LEVELS = {
     'NONE': global_ns.ESPHOMELIB_LOG_LEVEL_NONE,
@@ -32,14 +34,14 @@ LOG_LEVEL_TO_ESP_LOG = {
 LOG_LEVEL_SEVERITY = ['NONE', 'ERROR', 'WARN', 'INFO', 'DEBUG', 'VERBOSE', 'VERY_VERBOSE']

 # pylint: disable=invalid-name
-is_log_level = vol.All(vol.Upper, cv.one_of(*LOG_LEVELS))
+is_log_level = cv.one_of(*LOG_LEVELS, upper=True)


 def validate_local_no_higher_than_global(value):
     global_level = value.get(CONF_LEVEL, 'DEBUG')
-    for tag, level in value.get(CONF_LOGS, {}).iteritems():
+    for tag, level in value.get(CONF_LOGS, {}).items():
         if LOG_LEVEL_SEVERITY.index(level) > LOG_LEVEL_SEVERITY.index(global_level):
-            raise ESPHomeYAMLError(u"The local log level {} for {} must be less severe than the "
+            raise EsphomeyamlError(u"The local log level {} for {} must be less severe than the "
                                    u"global log level {}.".format(level, tag, global_level))
     return value

@@ -64,14 +66,37 @@ def to_code(config):
         add(log.set_tx_buffer_size(config[CONF_TX_BUFFER_SIZE]))
     if CONF_LEVEL in config:
         add(log.set_global_log_level(LOG_LEVELS[config[CONF_LEVEL]]))
-    for tag, level in config.get(CONF_LOGS, {}).iteritems():
+    for tag, level in config.get(CONF_LOGS, {}).items():
         add(log.set_log_level(tag, LOG_LEVELS[level]))


 def required_build_flags(config):
+    flags = []
     if CONF_LEVEL in config:
-        return u'-DESPHOMELIB_LOG_LEVEL={}'.format(str(LOG_LEVELS[config[CONF_LEVEL]]))
-    return None
+        flags.append(u'-DESPHOMELIB_LOG_LEVEL={}'.format(str(LOG_LEVELS[config[CONF_LEVEL]])))
+        this_severity = LOG_LEVEL_SEVERITY.index(config[CONF_LEVEL])
+        verbose_severity = LOG_LEVEL_SEVERITY.index('VERBOSE')
+        is_at_least_verbose = this_severity >= verbose_severity
+        has_serial_logging = config.get(CONF_BAUD_RATE) != 0
+        if CORE.is_esp8266 and has_serial_logging and is_at_least_verbose:
+            flags.append(u"-DDEBUG_ESP_PORT=Serial")
+            flags.append(u"-DLWIP_DEBUG")
+            DEBUG_COMPONENTS = {
+                'HTTP_CLIENT',
+                'HTTP_SERVER',
+                'HTTP_UPDATE',
+                'OTA',
+                'SSL',
+                'TLS_MEM',
+                'UPDATER',
+                'WIFI',
+            }
+            for comp in DEBUG_COMPONENTS:
+                flags.append(u"-DDEBUG_ESP_{}".format(comp))
+        if CORE.is_esp32 and is_at_least_verbose:
+            flags.append('-DCORE_DEBUG_LEVEL=5')

+    return flags

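`required_build_flags` can now return several flags: besides `-DESPHOMELIB_LOG_LEVEL` it turns on the Arduino core's own debug output (`DEBUG_ESP_PORT`, `LWIP_DEBUG` and the `DEBUG_ESP_*` component switches on ESP8266, `CORE_DEBUG_LEVEL=5` on ESP32) whenever the configured level is VERBOSE or higher and serial logging is enabled. A minimal sketch of the same decision, detached from esphomeyaml's `CORE` object and listing only a few of the `DEBUG_ESP_*` switches:

```python
LOG_LEVEL_SEVERITY = ['NONE', 'ERROR', 'WARN', 'INFO', 'DEBUG', 'VERBOSE', 'VERY_VERBOSE']


def required_build_flags(level, baud_rate, is_esp8266):
    """Sketch of the flag selection; the real code derives the platform from CORE."""
    flags = [u'-DESPHOMELIB_LOG_LEVEL=ESPHOMELIB_LOG_LEVEL_{}'.format(level)]
    is_at_least_verbose = (LOG_LEVEL_SEVERITY.index(level)
                           >= LOG_LEVEL_SEVERITY.index('VERBOSE'))
    if is_esp8266 and baud_rate != 0 and is_at_least_verbose:
        flags.append(u'-DDEBUG_ESP_PORT=Serial')
        flags.append(u'-DLWIP_DEBUG')
        flags.extend(u'-DDEBUG_ESP_{}'.format(comp)
                     for comp in ('HTTP_CLIENT', 'OTA', 'SSL', 'WIFI'))
    if not is_esp8266 and is_at_least_verbose:
        flags.append('-DCORE_DEBUG_LEVEL=5')
    return flags


print(required_build_flags('VERBOSE', 115200, is_esp8266=True))
```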
 def maybe_simple_message(schema):
@@ -97,7 +122,7 @@ def validate_printf(value):
       [cCdiouxXeEfgGaAnpsSZ] # type
     ) | # OR
     %%) # literal "%%"
-    """
+    """  # noqa
     matches = re.findall(cfmt, value[CONF_FORMAT], flags=re.X)
     if len(matches) != len(value[CONF_ARGS]):
         raise vol.Invalid(u"Found {} printf-patterns ({}), but {} args were given!"
@@ -108,21 +133,20 @@ def validate_printf(value):
 CONF_LOGGER_LOG = 'logger.log'
 LOGGER_LOG_ACTION_SCHEMA = vol.All(maybe_simple_message({
     vol.Required(CONF_FORMAT): cv.string,
-    vol.Optional(CONF_ARGS, default=list): vol.All(cv.ensure_list, [cv.lambda_]),
-    vol.Optional(CONF_LEVEL, default="DEBUG"): vol.All(vol.Upper, cv.one_of(*LOG_LEVEL_TO_ESP_LOG)),
+    vol.Optional(CONF_ARGS, default=list): cv.ensure_list(cv.lambda_),
+    vol.Optional(CONF_LEVEL, default="DEBUG"): cv.one_of(*LOG_LEVEL_TO_ESP_LOG, upper=True),
     vol.Optional(CONF_TAG, default="main"): cv.string,
 }), validate_printf)


 @ACTION_REGISTRY.register(CONF_LOGGER_LOG, LOGGER_LOG_ACTION_SCHEMA)
-def logger_log_action_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def logger_log_action_to_code(config, action_id, arg_type, template_arg):
     esp_log = LOG_LEVEL_TO_ESP_LOG[config[CONF_LEVEL]]
-    args = [RawExpression(unicode(x)) for x in config[CONF_ARGS]]
+    args = [RawExpression(text_type(x)) for x in config[CONF_ARGS]]

-    text = unicode(statement(esp_log(config[CONF_TAG], config[CONF_FORMAT], *args)))
+    text = text_type(statement(esp_log(config[CONF_TAG], config[CONF_FORMAT], *args)))

-    for lambda_ in process_lambda(Lambda(text), [(arg_type, 'x')]):
+    for lambda_ in process_lambda(Lambda(text), [(arg_type, 'x')], return_type=void):
         yield None
     rhs = LambdaAction.new(template_arg, lambda_)
     type = LambdaAction.template(template_arg)
@@ -7,17 +7,18 @@ from esphomeyaml import automation
 from esphomeyaml.automation import ACTION_REGISTRY
 from esphomeyaml.components import logger
 import esphomeyaml.config_validation as cv
-from esphomeyaml.const import CONF_BIRTH_MESSAGE, CONF_BROKER, CONF_CLIENT_ID, CONF_DISCOVERY, \
-    CONF_DISCOVERY_PREFIX, CONF_DISCOVERY_RETAIN, CONF_ID, CONF_KEEPALIVE, CONF_LEVEL, \
-    CONF_LOG_TOPIC, CONF_ON_MESSAGE, CONF_PASSWORD, CONF_PAYLOAD, CONF_PORT, CONF_QOS, \
-    CONF_REBOOT_TIMEOUT, CONF_RETAIN, CONF_SHUTDOWN_MESSAGE, CONF_SSL_FINGERPRINTS, CONF_TOPIC, \
-    CONF_TOPIC_PREFIX, CONF_TRIGGER_ID, CONF_USERNAME, CONF_WILL_MESSAGE, CONF_ON_JSON_MESSAGE, \
-    CONF_STATE_TOPIC, CONF_MQTT, CONF_ESPHOMEYAML, CONF_NAME, CONF_AVAILABILITY, \
-    CONF_PAYLOAD_AVAILABLE, CONF_PAYLOAD_NOT_AVAILABLE, CONF_INTERNAL
-from esphomeyaml.core import ESPHomeYAMLError
-from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, RawExpression, \
-    StructInitializer, TemplateArguments, add, esphomelib_ns, optional, std_string, templatable, \
-    uint8, bool_, JsonObjectRef, process_lambda, JsonObjectConstRef, Component, Action, Trigger
+from esphomeyaml.const import CONF_AVAILABILITY, CONF_BIRTH_MESSAGE, CONF_BROKER, CONF_CLIENT_ID, \
+    CONF_COMMAND_TOPIC, CONF_DISCOVERY, CONF_DISCOVERY_PREFIX, CONF_DISCOVERY_RETAIN, \
+    CONF_ESPHOMEYAML, CONF_ID, CONF_INTERNAL, CONF_KEEPALIVE, CONF_LEVEL, CONF_LOG_TOPIC, \
+    CONF_MQTT, CONF_NAME, CONF_ON_JSON_MESSAGE, CONF_ON_MESSAGE, CONF_PASSWORD, CONF_PAYLOAD, \
+    CONF_PAYLOAD_AVAILABLE, CONF_PAYLOAD_NOT_AVAILABLE, CONF_PORT, CONF_QOS, CONF_REBOOT_TIMEOUT, \
+    CONF_RETAIN, CONF_SHUTDOWN_MESSAGE, CONF_SSL_FINGERPRINTS, CONF_STATE_TOPIC, CONF_TOPIC, \
+    CONF_TOPIC_PREFIX, CONF_TRIGGER_ID, CONF_USERNAME, CONF_WILL_MESSAGE
+from esphomeyaml.core import EsphomeyamlError
+from esphomeyaml.cpp_generator import ArrayInitializer, Pvariable, RawExpression, \
+    StructInitializer, TemplateArguments, add, process_lambda, templatable
+from esphomeyaml.cpp_types import Action, App, Component, JsonObjectConstRef, JsonObjectRef, \
+    Trigger, bool_, esphomelib_ns, optional, std_string, uint8, void
 
 
 def validate_message_just_topic(value):
@@ -48,12 +49,14 @@ MQTTJsonMessageTrigger = mqtt_ns.class_('MQTTJsonMessageTrigger',
 MQTTComponent = mqtt_ns.class_('MQTTComponent', Component)
 
 
-def validate_broker(value):
-    value = cv.string_strict(value)
-    if u':' in value:
-        raise vol.Invalid(u"Please specify the port using the port: option")
-    if not value:
-        raise vol.Invalid(u"Broker cannot be empty")
+def validate_config(value):
+    if CONF_PORT not in value:
+        parts = value[CONF_BROKER].split(u':')
+        if len(parts) == 2:
+            value[CONF_BROKER] = parts[0]
+            value[CONF_PORT] = cv.port(parts[1])
+        else:
+            value[CONF_PORT] = 1883
     return value
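Side note on the validate_config change above: the broker value may now carry an embedded port. A minimal, self-contained sketch of that splitting rule, in plain Python with string keys standing in for CONF_BROKER/CONF_PORT (not code from this changeset):

    def split_broker(config):
        # Mirrors the new validate_config: "host:port" in the broker value
        # fills in the port, otherwise the port defaults to 1883.
        if 'port' not in config:
            parts = config['broker'].split(':')
            if len(parts) == 2:
                config['broker'] = parts[0]
                config['port'] = int(parts[1])
            else:
                config['port'] = 1883
        return config

    assert split_broker({'broker': '192.168.1.10:1884'}) == {'broker': '192.168.1.10', 'port': 1884}
    assert split_broker({'broker': 'mqtt.local'}) == {'broker': 'mqtt.local', 'port': 1883}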
@@ -64,14 +67,14 @@ def validate_fingerprint(value):
     return value
 
 
-CONFIG_SCHEMA = vol.Schema({
+CONFIG_SCHEMA = vol.All(vol.Schema({
     cv.GenerateID(): cv.declare_variable_id(MQTTClientComponent),
-    vol.Required(CONF_BROKER): validate_broker,
-    vol.Optional(CONF_PORT, default=1883): cv.port,
+    vol.Required(CONF_BROKER): cv.string_strict,
+    vol.Optional(CONF_PORT): cv.port,
     vol.Optional(CONF_USERNAME, default=''): cv.string,
     vol.Optional(CONF_PASSWORD, default=''): cv.string,
     vol.Optional(CONF_CLIENT_ID): vol.All(cv.string, vol.Length(max=23)),
-    vol.Optional(CONF_DISCOVERY): cv.boolean,
+    vol.Optional(CONF_DISCOVERY): vol.Any(cv.boolean, cv.one_of("CLEAN", upper=True)),
     vol.Optional(CONF_DISCOVERY_RETAIN): cv.boolean,
     vol.Optional(CONF_DISCOVERY_PREFIX): cv.publish_topic,
     vol.Optional(CONF_BIRTH_MESSAGE): MQTT_MESSAGE_SCHEMA,
@@ -82,20 +85,21 @@ CONFIG_SCHEMA = vol.Schema({
         vol.Optional(CONF_LEVEL): logger.is_log_level,
     }), validate_message_just_topic),
     vol.Optional(CONF_SSL_FINGERPRINTS): vol.All(cv.only_on_esp8266,
-                                                 cv.ensure_list, [validate_fingerprint]),
+                                                 cv.ensure_list(validate_fingerprint)),
     vol.Optional(CONF_KEEPALIVE): cv.positive_time_period_seconds,
     vol.Optional(CONF_REBOOT_TIMEOUT): cv.positive_time_period_milliseconds,
     vol.Optional(CONF_ON_MESSAGE): automation.validate_automation({
         cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(MQTTMessageTrigger),
         vol.Required(CONF_TOPIC): cv.subscribe_topic,
-        vol.Optional(CONF_QOS, default=0): cv.mqtt_qos,
+        vol.Optional(CONF_QOS): cv.mqtt_qos,
+        vol.Optional(CONF_PAYLOAD): cv.string_strict,
     }),
     vol.Optional(CONF_ON_JSON_MESSAGE): automation.validate_automation({
         cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(MQTTJsonMessageTrigger),
         vol.Required(CONF_TOPIC): cv.subscribe_topic,
         vol.Optional(CONF_QOS, default=0): cv.mqtt_qos,
     }),
-})
+}), validate_config)
 
 
 def exp_mqtt_message(config):
@@ -116,12 +120,17 @@ def to_code(config):
                         config[CONF_USERNAME], config[CONF_PASSWORD])
     mqtt = Pvariable(config[CONF_ID], rhs)
 
-    if not config.get(CONF_DISCOVERY, True):
-        add(mqtt.disable_discovery())
-    elif CONF_DISCOVERY_RETAIN in config or CONF_DISCOVERY_PREFIX in config:
+    discovery = config.get(CONF_DISCOVERY, True)
     discovery_retain = config.get(CONF_DISCOVERY_RETAIN, True)
     discovery_prefix = config.get(CONF_DISCOVERY_PREFIX, 'homeassistant')
 
+    if not discovery:
+        add(mqtt.disable_discovery())
+    elif discovery == "CLEAN":
+        add(mqtt.set_discovery_info(discovery_prefix, discovery_retain, True))
+    elif CONF_DISCOVERY_RETAIN in config or CONF_DISCOVERY_PREFIX in config:
         add(mqtt.set_discovery_info(discovery_prefix, discovery_retain))
 
     if CONF_TOPIC_PREFIX in config:
         add(mqtt.set_topic_prefix(config[CONF_TOPIC_PREFIX]))
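For context on the discovery handling above, the option now accepts a boolean or the string "CLEAN". A small self-contained sketch of which call each value would map to (string keys and return values are illustrative only, not the generated C++):

    def discovery_calls(config):
        # Illustrative only: which call the generated code would emit for the
        # three accepted 'discovery' values (True / False / "CLEAN").
        discovery = config.get('discovery', True)
        retain = config.get('discovery_retain', True)
        prefix = config.get('discovery_prefix', 'homeassistant')
        if not discovery:
            return 'disable_discovery()'
        if discovery == 'CLEAN':
            return "set_discovery_info(%r, %r, True)" % (prefix, retain)
        if 'discovery_retain' in config or 'discovery_prefix' in config:
            return "set_discovery_info(%r, %r)" % (prefix, retain)
        return None

    assert discovery_calls({'discovery': False}) == 'disable_discovery()'
    assert discovery_calls({'discovery': 'CLEAN'}) == "set_discovery_info('homeassistant', True, True)"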
@@ -169,8 +178,12 @@ def to_code(config):
         add(mqtt.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))
 
     for conf in config.get(CONF_ON_MESSAGE, []):
-        rhs = mqtt.make_message_trigger(conf[CONF_TOPIC], conf[CONF_QOS])
+        rhs = App.register_component(mqtt.make_message_trigger(conf[CONF_TOPIC]))
         trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
+        if CONF_QOS in conf:
+            add(trigger.set_qos(conf[CONF_QOS]))
+        if CONF_PAYLOAD in conf:
+            add(trigger.set_payload(conf[CONF_PAYLOAD]))
         automation.build_automation(trigger, std_string, conf)
 
     for conf in config.get(CONF_ON_JSON_MESSAGE, []):
@@ -189,8 +202,7 @@ MQTT_PUBLISH_ACTION_SCHEMA = vol.Schema({
 
 
 @ACTION_REGISTRY.register(CONF_MQTT_PUBLISH, MQTT_PUBLISH_ACTION_SCHEMA)
-def mqtt_publish_action_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def mqtt_publish_action_to_code(config, action_id, arg_type, template_arg):
     rhs = App.Pget_mqtt_client().Pmake_publish_action(template_arg)
     type = MQTTPublishAction.template(template_arg)
     action = Pvariable(action_id, rhs, type=type)
@@ -222,8 +234,7 @@ MQTT_PUBLISH_JSON_ACTION_SCHEMA = vol.Schema({
 
 
 @ACTION_REGISTRY.register(CONF_MQTT_PUBLISH_JSON, MQTT_PUBLISH_JSON_ACTION_SCHEMA)
-def mqtt_publish_json_action_to_code(config, action_id, arg_type):
-    template_arg = TemplateArguments(arg_type)
+def mqtt_publish_json_action_to_code(config, action_id, arg_type, template_arg):
     rhs = App.Pget_mqtt_client().Pmake_publish_json_action(template_arg)
     type = MQTTPublishJsonAction.template(template_arg)
     action = Pvariable(action_id, rhs, type=type)
@@ -231,7 +242,8 @@ def mqtt_publish_json_action_to_code(config, action_id, arg_type):
         yield None
     add(action.set_topic(template_))
 
-    for lambda_ in process_lambda(config[CONF_PAYLOAD], [(arg_type, 'x'), (JsonObjectRef, 'root')]):
+    for lambda_ in process_lambda(config[CONF_PAYLOAD], [(arg_type, 'x'), (JsonObjectRef, 'root')],
+                                  return_type=void):
         yield None
     add(action.set_payload(lambda_))
     if CONF_QOS in config:
@@ -254,12 +266,11 @@ def get_default_topic_for(data, component_type, name, suffix):
                                       sanitized_name, suffix)
 
 
-def build_hass_config(data, component_type, config, include_state=True, include_command=True,
-                      platform='mqtt'):
+def build_hass_config(data, component_type, config, include_state=True, include_command=True):
     if config.get(CONF_INTERNAL, False):
         return None
     ret = OrderedDict()
-    ret['platform'] = platform
+    ret['platform'] = 'mqtt'
     ret['name'] = config[CONF_NAME]
     if include_state:
         default = get_default_topic_for(data, component_type, config[CONF_NAME], 'state')
@@ -282,7 +293,7 @@ def build_hass_config(data, component_type, config, include_state=True, include_
 class GenerateHassConfigData(object):
     def __init__(self, config):
         if 'mqtt' not in config:
-            raise ESPHomeYAMLError("Cannot generate Home Assistant MQTT config if MQTT is not "
+            raise EsphomeyamlError("Cannot generate Home Assistant MQTT config if MQTT is not "
                                    "used!")
         mqtt = config[CONF_MQTT]
         self.topic_prefix = mqtt.get(CONF_TOPIC_PREFIX, config[CONF_ESPHOMEYAML][CONF_NAME])
@@ -308,3 +319,21 @@ class GenerateHassConfigData(object):
             CONF_PAYLOAD_AVAILABLE: birth_message[CONF_PAYLOAD],
             CONF_PAYLOAD_NOT_AVAILABLE: will_message[CONF_PAYLOAD],
         }
+
+
+def setup_mqtt_component(obj, config):
+    if CONF_RETAIN in config:
+        add(obj.set_retain(config[CONF_RETAIN]))
+    if not config.get(CONF_DISCOVERY, True):
+        add(obj.disable_discovery())
+    if CONF_STATE_TOPIC in config:
+        add(obj.set_custom_state_topic(config[CONF_STATE_TOPIC]))
+    if CONF_COMMAND_TOPIC in config:
+        add(obj.set_custom_command_topic(config[CONF_COMMAND_TOPIC]))
+    if CONF_AVAILABILITY in config:
+        availability = config[CONF_AVAILABILITY]
+        if not availability:
+            add(obj.disable_availability())
+        else:
+            add(obj.set_availability(availability[CONF_TOPIC], availability[CONF_PAYLOAD_AVAILABLE],
+                                     availability[CONF_PAYLOAD_NOT_AVAILABLE]))
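The new setup_mqtt_component helper above centralizes the per-component MQTT options. A rough, self-contained sketch of the mapping from config keys to setter calls, using a recorder object in place of the generated component (names simplified, not part of the diff):

    class Recorder(object):
        # Stand-in for the generated MQTT component object; records setter calls.
        def __init__(self):
            self.calls = []

        def __getattr__(self, name):
            return lambda *args: self.calls.append((name, args))

    def setup_mqtt_component_sketch(obj, config):
        # Same branch structure as the new setup_mqtt_component, with plain string keys.
        if 'retain' in config:
            obj.set_retain(config['retain'])
        if not config.get('discovery', True):
            obj.disable_discovery()
        if 'state_topic' in config:
            obj.set_custom_state_topic(config['state_topic'])
        if 'command_topic' in config:
            obj.set_custom_command_topic(config['command_topic'])

    obj = Recorder()
    setup_mqtt_component_sketch(obj, {'retain': False, 'state_topic': 'light/kitchen/state'})
    assert obj.calls == [('set_retain', (False,)), ('set_custom_state_topic', ('light/kitchen/state',))]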
@@ -1,18 +1,18 @@
 import voluptuous as vol
 
-import esphomeyaml.config_validation as cv
 from esphomeyaml import pins
 from esphomeyaml.components import output
-from esphomeyaml.const import (CONF_DATA_PIN, CONF_CLOCK_PIN, CONF_NUM_CHANNELS,
-                               CONF_NUM_CHIPS, CONF_BIT_DEPTH, CONF_ID,
-                               CONF_UPDATE_ON_BOOT)
-from esphomeyaml.helpers import (gpio_output_pin_expression, App, Pvariable,
-                                 add, setup_component, Component)
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import (CONF_BIT_DEPTH, CONF_CLOCK_PIN, CONF_DATA_PIN, CONF_ID,
+                               CONF_NUM_CHANNELS, CONF_NUM_CHIPS, CONF_UPDATE_ON_BOOT)
+from esphomeyaml.cpp_generator import Pvariable, add
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
+from esphomeyaml.cpp_types import App, Component
 
 MY9231OutputComponent = output.output_ns.class_('MY9231OutputComponent', Component)
+MULTI_CONF = True
 
-MY9231_SCHEMA = vol.Schema({
+CONFIG_SCHEMA = vol.Schema({
     cv.GenerateID(): cv.declare_variable_id(MY9231OutputComponent),
     vol.Required(CONF_DATA_PIN): pins.gpio_output_pin_schema,
     vol.Required(CONF_CLOCK_PIN): pins.gpio_output_pin_schema,
@@ -20,33 +20,27 @@ MY9231_SCHEMA = vol.Schema({
                                              vol.Range(3, 1020)),
     vol.Optional(CONF_NUM_CHIPS): vol.All(vol.Coerce(int),
                                           vol.Range(1, 255)),
-    vol.Optional(CONF_BIT_DEPTH): vol.All(vol.Coerce(int),
-                                          cv.one_of(8, 12, 14, 16)),
+    vol.Optional(CONF_BIT_DEPTH): cv.one_of(8, 12, 14, 16, int=True),
     vol.Optional(CONF_UPDATE_ON_BOOT): vol.Coerce(bool),
 }).extend(cv.COMPONENT_SCHEMA.schema)
 
-CONFIG_SCHEMA = vol.All(cv.ensure_list, [MY9231_SCHEMA])
-
 
 def to_code(config):
-    for conf in config:
-        di = None
-        for di in gpio_output_pin_expression(conf[CONF_DATA_PIN]):
-            yield
-        dcki = None
-        for dcki in gpio_output_pin_expression(conf[CONF_CLOCK_PIN]):
-            yield
-        rhs = App.make_my9231_component(di, dcki)
-        my9231 = Pvariable(conf[CONF_ID], rhs)
-        if CONF_NUM_CHANNELS in conf:
-            add(my9231.set_num_channels(conf[CONF_NUM_CHANNELS]))
-        if CONF_NUM_CHIPS in conf:
-            add(my9231.set_num_chips(conf[CONF_NUM_CHIPS]))
-        if CONF_BIT_DEPTH in conf:
-            add(my9231.set_bit_depth(conf[CONF_BIT_DEPTH]))
-        if CONF_UPDATE_ON_BOOT in conf:
-            add(my9231.set_update(conf[CONF_UPDATE_ON_BOOT]))
-        setup_component(my9231, conf)
+    for di in gpio_output_pin_expression(config[CONF_DATA_PIN]):
+        yield
+    for dcki in gpio_output_pin_expression(config[CONF_CLOCK_PIN]):
+        yield
+    rhs = App.make_my9231_component(di, dcki)
+    my9231 = Pvariable(config[CONF_ID], rhs)
+    if CONF_NUM_CHANNELS in config:
+        add(my9231.set_num_channels(config[CONF_NUM_CHANNELS]))
+    if CONF_NUM_CHIPS in config:
+        add(my9231.set_num_chips(config[CONF_NUM_CHIPS]))
+    if CONF_BIT_DEPTH in config:
+        add(my9231.set_bit_depth(config[CONF_BIT_DEPTH]))
+    if CONF_UPDATE_ON_BOOT in config:
+        add(my9231.set_update(config[CONF_UPDATE_ON_BOOT]))
+    setup_component(my9231, config)
 
 
 BUILD_FLAGS = '-DUSE_MY9231_OUTPUT'
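The MULTI_CONF flag introduced above moves list handling out of the component: the framework is expected to invoke to_code once per configuration block instead of the component looping over the list itself. A hypothetical, self-contained sketch of that dispatch (the run_component driver and the returned strings are illustrative, not esphomeyaml API):

    def to_code(config):
        # Per-entry generator, as in the new my9231 component (greatly simplified).
        return 'my9231 on data pin %s' % config['data_pin']

    def run_component(configs, multi_conf=True):
        # Hypothetical framework side: with MULTI_CONF the config is a list and
        # to_code runs once per entry; without it, to_code sees the whole dict.
        if multi_conf:
            return [to_code(conf) for conf in configs]
        return [to_code(configs)]

    assert run_component([{'data_pin': 12}, {'data_pin': 13}]) == \
        ['my9231 on data pin 12', 'my9231 on data pin 13']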
@@ -2,12 +2,11 @@ import logging
 
 import voluptuous as vol
 
-from esphomeyaml import core
 import esphomeyaml.config_validation as cv
-from esphomeyaml.const import CONF_ID, CONF_OTA, CONF_PASSWORD, CONF_PORT, CONF_SAFE_MODE, \
-    ESP_PLATFORM_ESP32, ESP_PLATFORM_ESP8266
-from esphomeyaml.core import ESPHomeYAMLError
-from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, Component
+from esphomeyaml.const import CONF_ID, CONF_OTA, CONF_PASSWORD, CONF_PORT, CONF_SAFE_MODE
+from esphomeyaml.core import CORE
+from esphomeyaml.cpp_generator import Pvariable, add
+from esphomeyaml.cpp_types import App, Component, esphomelib_ns
 
 _LOGGER = logging.getLogger(__name__)
 
@@ -35,11 +34,11 @@ def to_code(config):
 def get_port(config):
     if CONF_PORT in config[CONF_OTA]:
         return config[CONF_OTA][CONF_PORT]
-    if core.ESP_PLATFORM == ESP_PLATFORM_ESP32:
+    if CORE.is_esp32:
         return 3232
-    elif core.ESP_PLATFORM == ESP_PLATFORM_ESP8266:
+    if CORE.is_esp8266:
         return 8266
-    raise ESPHomeYAMLError(u"Invalid ESP Platform for ESP OTA port.")
+    raise NotImplementedError
 
 
 def get_auth(config):
@@ -51,6 +50,8 @@ REQUIRED_BUILD_FLAGS = '-DUSE_NEW_OTA'
 
 
 def lib_deps(config):
-    if core.ESP_PLATFORM == ESP_PLATFORM_ESP32:
-        return ['ArduinoOTA', 'Update', 'ESPmDNS']
-    return ['Hash', 'ESP8266mDNS', 'ArduinoOTA']
+    if CORE.is_esp32:
+        return ['Update', 'ESPmDNS']
+    if CORE.is_esp8266:
+        return ['Hash', 'ESP8266mDNS']
+    raise NotImplementedError
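The OTA changes above replace core.ESP_PLATFORM comparisons with CORE.is_esp32/CORE.is_esp8266 properties. A self-contained sketch of that shape (the Core class below is a stand-in for illustration, not the real CORE object, and lib_deps takes the core explicitly only to keep the example runnable):

    class Core(object):
        # Illustrative stand-in for esphomeyaml's CORE singleton.
        def __init__(self, esp_platform):
            self.esp_platform = esp_platform

        @property
        def is_esp32(self):
            return self.esp_platform == 'ESP32'

        @property
        def is_esp8266(self):
            return self.esp_platform == 'ESP8266'

    def lib_deps(core):
        # Same structure as the new OTA lib_deps.
        if core.is_esp32:
            return ['Update', 'ESPmDNS']
        if core.is_esp8266:
            return ['Hash', 'ESP8266mDNS']
        raise NotImplementedError

    assert lib_deps(Core('ESP8266')) == ['Hash', 'ESP8266mDNS']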
@@ -4,8 +4,9 @@ from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
 import esphomeyaml.config_validation as cv
 from esphomeyaml.components.power_supply import PowerSupplyComponent
 from esphomeyaml.const import CONF_INVERTED, CONF_MAX_POWER, CONF_POWER_SUPPLY, CONF_ID, CONF_LEVEL
-from esphomeyaml.helpers import add, esphomelib_ns, get_variable, TemplateArguments, Pvariable, \
-    templatable, float_, add_job, Action
+from esphomeyaml.core import CORE
+from esphomeyaml.cpp_generator import add, get_variable, Pvariable, templatable
+from esphomeyaml.cpp_types import esphomelib_ns, Action, float_
 
 PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
 
@@ -26,7 +27,9 @@ FLOAT_OUTPUT_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(FLOAT_OUTPUT_SCHEMA.schema
 
 output_ns = esphomelib_ns.namespace('output')
 BinaryOutput = output_ns.class_('BinaryOutput')
+BinaryOutputPtr = BinaryOutput.operator('ptr')
 FloatOutput = output_ns.class_('FloatOutput', BinaryOutput)
+FloatOutputPtr = FloatOutput.operator('ptr')
 
 # Actions
 TurnOffAction = output_ns.class_('TurnOffAction', Action)

@@ -47,7 +50,12 @@ def setup_output_platform_(obj, config, skip_power_supply=False):
 
 
 def setup_output_platform(obj, config, skip_power_supply=False):
-    add_job(setup_output_platform_, obj, config, skip_power_supply)
+    CORE.add_job(setup_output_platform_, obj, config, skip_power_supply)
+
+
+def register_output(var, config):
+    output_var = Pvariable(config[CONF_ID], var, has_side_effects=True)
+    CORE.add_job(setup_output_platform_, output_var, config)
 
 
 BUILD_FLAGS = '-DUSE_OUTPUT'
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_OUTPUT_TURN_ON, OUTPUT_TURN_ON_ACTION)
|
@ACTION_REGISTRY.register(CONF_OUTPUT_TURN_ON, OUTPUT_TURN_ON_ACTION)
|
||||||
def output_turn_on_to_code(config, action_id, arg_type):
|
def output_turn_on_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_turn_on_action(template_arg)
|
rhs = var.make_turn_on_action(template_arg)
|
||||||
|
@ -76,8 +83,7 @@ OUTPUT_TURN_OFF_ACTION = maybe_simple_id({
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_OUTPUT_TURN_OFF, OUTPUT_TURN_OFF_ACTION)
|
@ACTION_REGISTRY.register(CONF_OUTPUT_TURN_OFF, OUTPUT_TURN_OFF_ACTION)
|
||||||
def output_turn_off_to_code(config, action_id, arg_type):
|
def output_turn_off_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_turn_off_action(template_arg)
|
rhs = var.make_turn_off_action(template_arg)
|
||||||
|
@ -93,8 +99,7 @@ OUTPUT_SET_LEVEL_ACTION = vol.Schema({
|
||||||
|
|
||||||
|
|
||||||
@ACTION_REGISTRY.register(CONF_OUTPUT_SET_LEVEL, OUTPUT_SET_LEVEL_ACTION)
|
@ACTION_REGISTRY.register(CONF_OUTPUT_SET_LEVEL, OUTPUT_SET_LEVEL_ACTION)
|
||||||
def output_set_level_to_code(config, action_id, arg_type):
|
def output_set_level_to_code(config, action_id, arg_type, template_arg):
|
||||||
template_arg = TemplateArguments(arg_type)
|
|
||||||
for var in get_variable(config[CONF_ID]):
|
for var in get_variable(config[CONF_ID]):
|
||||||
yield None
|
yield None
|
||||||
rhs = var.make_set_level_action(template_arg)
|
rhs = var.make_set_level_action(template_arg)
|
||||||
|
|
 66  esphomeyaml/components/output/custom.py  (new file)
@@ -0,0 +1,66 @@
+import voluptuous as vol
+
+from esphomeyaml.components import output
+import esphomeyaml.config_validation as cv
+from esphomeyaml.const import CONF_ID, CONF_LAMBDA, CONF_OUTPUTS, CONF_TYPE
+from esphomeyaml.cpp_generator import process_lambda, variable
+from esphomeyaml.cpp_types import std_vector
+
+CustomBinaryOutputConstructor = output.output_ns.class_('CustomBinaryOutputConstructor')
+CustomFloatOutputConstructor = output.output_ns.class_('CustomFloatOutputConstructor')
+
+BINARY_SCHEMA = output.PLATFORM_SCHEMA.extend({
+    cv.GenerateID(): cv.declare_variable_id(CustomBinaryOutputConstructor),
+    vol.Required(CONF_LAMBDA): cv.lambda_,
+    vol.Required(CONF_OUTPUTS):
+        cv.ensure_list(output.BINARY_OUTPUT_SCHEMA.extend({
+            cv.GenerateID(): cv.declare_variable_id(output.BinaryOutput),
+        })),
+})
+
+FLOAT_SCHEMA = output.PLATFORM_SCHEMA.extend({
+    cv.GenerateID(): cv.declare_variable_id(CustomFloatOutputConstructor),
+    vol.Required(CONF_LAMBDA): cv.lambda_,
+    vol.Required(CONF_OUTPUTS):
+        cv.ensure_list(output.FLOAT_OUTPUT_PLATFORM_SCHEMA.extend({
+            cv.GenerateID(): cv.declare_variable_id(output.FloatOutput),
+        })),
+})
+
+
+def validate_custom_output(value):
+    if not isinstance(value, dict):
+        raise vol.Invalid("Value must be dict")
+    if CONF_TYPE not in value:
+        raise vol.Invalid("type not specified!")
+    type = cv.string_strict(value[CONF_TYPE]).lower()
+    value[CONF_TYPE] = type
+    if type == 'binary':
+        return BINARY_SCHEMA(value)
+    if type == 'float':
+        return FLOAT_SCHEMA(value)
+    raise vol.Invalid("type must either be binary or float, not {}!".format(type))
+
+
+PLATFORM_SCHEMA = validate_custom_output
+
+
+def to_code(config):
+    type = config[CONF_TYPE]
+    if type == 'binary':
+        ret_type = output.BinaryOutputPtr
+        klass = CustomBinaryOutputConstructor
+    else:
+        ret_type = output.FloatOutputPtr
+        klass = CustomFloatOutputConstructor
+    for template_ in process_lambda(config[CONF_LAMBDA], [],
+                                    return_type=std_vector.template(ret_type)):
+        yield
+
+    rhs = klass(template_)
+    custom = variable(config[CONF_ID], rhs)
+    for i, sens in enumerate(config[CONF_OUTPUTS]):
+        output.register_output(custom.get_output(i), sens)
+
+
+BUILD_FLAGS = '-DUSE_CUSTOM_OUTPUT'
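The new custom output platform above dispatches on the type key to pick a binary or float constructor. A reduced, self-contained sketch of that dispatch (the string results stand in for the generated constructors; schema validation details are omitted):

    def pick_constructor(config):
        # Mirrors validate_custom_output / to_code: the 'type' key selects which
        # constructor and pointer type the generated code will use.
        type_ = str(config['type']).lower()
        if type_ == 'binary':
            return 'CustomBinaryOutputConstructor'
        if type_ == 'float':
            return 'CustomFloatOutputConstructor'
        raise ValueError("type must either be binary or float, not {}!".format(type_))

    assert pick_constructor({'type': 'BINARY'}) == 'CustomBinaryOutputConstructor'
    assert pick_constructor({'type': 'float'}) == 'CustomFloatOutputConstructor'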
@@ -3,9 +3,10 @@ import voluptuous as vol
 from esphomeyaml import pins
 from esphomeyaml.components import output
 import esphomeyaml.config_validation as cv
-from esphomeyaml.const import CONF_ID, CONF_NUMBER, CONF_PIN, ESP_PLATFORM_ESP8266, CONF_FREQUENCY
-from esphomeyaml.helpers import App, Component, Pvariable, gpio_output_pin_expression, \
-    setup_component, add
+from esphomeyaml.const import CONF_FREQUENCY, CONF_ID, CONF_NUMBER, CONF_PIN, ESP_PLATFORM_ESP8266
+from esphomeyaml.cpp_generator import Pvariable, add
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
+from esphomeyaml.cpp_types import App, Component
 
 ESP_PLATFORMS = [ESP_PLATFORM_ESP8266]
 
@@ -1,11 +1,12 @@
 import voluptuous as vol
 
 from esphomeyaml import pins
-import esphomeyaml.config_validation as cv
 from esphomeyaml.components import output
+import esphomeyaml.config_validation as cv
 from esphomeyaml.const import CONF_ID, CONF_PIN
-from esphomeyaml.helpers import App, Pvariable, gpio_output_pin_expression, setup_component, \
-    Component
+from esphomeyaml.cpp_generator import Pvariable
+from esphomeyaml.cpp_helpers import gpio_output_pin_expression, setup_component
+from esphomeyaml.cpp_types import App, Component
 
 GPIOBinaryOutputComponent = output.output_ns.class_('GPIOBinaryOutputComponent',
                                                     output.BinaryOutput, Component)
 
@@ -1,11 +1,13 @@
 import voluptuous as vol
 
-import esphomeyaml.config_validation as cv
 from esphomeyaml import pins
 from esphomeyaml.components import output
+import esphomeyaml.config_validation as cv
 from esphomeyaml.const import APB_CLOCK_FREQ, CONF_BIT_DEPTH, CONF_CHANNEL, CONF_FREQUENCY, \
     CONF_ID, CONF_PIN, ESP_PLATFORM_ESP32
-from esphomeyaml.helpers import App, Pvariable, add, setup_component, Component
+from esphomeyaml.cpp_generator import Pvariable, add
+from esphomeyaml.cpp_helpers import setup_component
+from esphomeyaml.cpp_types import App, Component
 
 ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
 