| Feature | Description |
| --- | --- |
| Resizable clusters | Create and scale clusters quickly with various virtual machine types, disk sizes, number of nodes, and networking options. |
| Autoscaling clusters | Dataproc autoscaling automates cluster resource management by automatically adding and removing cluster workers (nodes). |
| Cloud integrated | Built-in integration with Cloud Storage, BigQuery, Cloud Bigtable, Cloud Logging, Cloud Monitoring, and AI Hub, giving you a more complete and robust data platform. |
| Versioning | Image versioning allows you to switch between different versions of Apache Spark, Apache Hadoop, and other tools. |
| Highly available | Run clusters in high availability mode with multiple master nodes and set jobs to restart on failure to help ensure your clusters and jobs are highly available. |
| Cluster scheduled deletion | To help avoid incurring charges for an inactive cluster, you can use Dataproc's scheduled deletion, which provides options to delete a cluster after a specified cluster idle period, at a specified future time, or after a specified time period. |
| Automatic or manual configuration | Dataproc automatically configures hardware and software but also gives you manual control. |
| Developer tools | Multiple ways to manage a cluster, including an easy-to-use web UI, the Cloud SDK, RESTful APIs, and SSH access. |
| Initialization actions | Run initialization actions to install or customize the settings and libraries you need when your cluster is created. |
| Optional components | Use optional components to install and configure additional components on the cluster. Optional components are integrated with Dataproc components and offer fully configured environments for Zeppelin, Druid, Presto, and other open source software components related to the Apache Hadoop and Apache Spark ecosystem. |
| Custom images | Dataproc clusters can be provisioned with a custom image that includes your pre-installed Linux operating system packages. |
| Flexible virtual machines | Clusters can use custom machine types and preemptible virtual machines to make them the perfect size for your needs. |
| Component Gateway and notebook access | Dataproc Component Gateway enables secure, one-click access to Dataproc default and optional component web interfaces running on the cluster. |
| Workflow templates | Dataproc workflow templates provide a flexible and easy-to-use mechanism for managing and executing workflows. A workflow template is a reusable workflow configuration that defines a graph of jobs with information on where to run those jobs. |
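Many of the features in the table map directly to flags on `gcloud dataproc clusters create`. The following is an illustrative sketch only: the cluster name, region, policy name, and Cloud Storage path are hypothetical placeholders, not values from this page.

```shell
# Hypothetical example: create a cluster that exercises several features above.
# demo-cluster, us-central1, my-autoscaling-policy, and gs://my-bucket/... are
# placeholders you would replace with your own values.
gcloud dataproc clusters create demo-cluster \
    --region=us-central1 \
    --image-version=2.1 \
    --worker-machine-type=n1-standard-4 \
    --num-workers=2 \
    --num-secondary-workers=2 \
    --autoscaling-policy=my-autoscaling-policy \
    --max-idle=30m \
    --initialization-actions=gs://my-bucket/scripts/init.sh \
    --optional-components=ZEPPELIN \
    --enable-component-gateway
```

Here `--image-version` selects the image (versioning), `--worker-machine-type` and `--num-secondary-workers` cover flexible and preemptible VMs, `--autoscaling-policy` attaches an autoscaling policy, `--max-idle` enables scheduled deletion after an idle period, and the remaining flags correspond to initialization actions, optional components, and Component Gateway.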
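Workflow templates are also managed through the CLI: you create a template, attach a cluster definition and jobs, then instantiate it to run the graph. A minimal sketch, assuming hypothetical names (`my-template`, `wf-cluster`, the region, and the example Spark job are placeholders):

```shell
# Hypothetical example: define and run a reusable workflow template.
gcloud dataproc workflow-templates create my-template \
    --region=us-central1

# The template manages its own ephemeral cluster for each run.
gcloud dataproc workflow-templates set-managed-cluster my-template \
    --region=us-central1 \
    --cluster-name=wf-cluster \
    --num-workers=2

# Add a job step to the graph (SparkPi from the stock Spark examples jar).
gcloud dataproc workflow-templates add-job spark \
    --workflow-template=my-template \
    --region=us-central1 \
    --step-id=compute-pi \
    --class=org.apache.spark.examples.SparkPi \
    --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
    -- 1000

# Instantiate: creates the cluster, runs the jobs, then tears the cluster down.
gcloud dataproc workflow-templates instantiate my-template \
    --region=us-central1
```

Because the template is a stored, reusable configuration, the same `instantiate` call can be repeated or scheduled without redefining the cluster or job graph.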