Professional-Cloud-Architect Exam Questions

Total 251 Questions

Last Updated: 22-Oct-2024

Topic 2, TerramEarth Case Study

   

Company Overview
TerramEarth manufactures heavy equipment for the mining and agricultural industries. About 80% of their
business is from mining and 20% from agriculture. They currently have over 500 dealers and service centers in
100 countries. Their mission is to build products that make their customers more productive.
Company Background
TerramEarth formed in 1946, when several small, family owned companies combined to retool after World
War II. The company cares about their employees and customers and considers them to be extended members
of their family.
TerramEarth is proud of their ability to innovate on their core products and find new markets as their
customers' needs change. For the past 20 years trends in the industry have been largely toward increasing
productivity by using larger vehicles with a human operator.
Solution Concept
There are 20 million TerramEarth vehicles in operation that collect 120 fields of data per second. Data is
stored locally on the vehicle and can be accessed for analysis when a vehicle is serviced. The data is
downloaded via a maintenance port. This same port can be used to adjust operational parameters, allowing the
vehicles to be upgraded in the field with new computing modules.
Approximately 200,000 vehicles are connected to a cellular network, allowing TerramEarth to collect data
directly. At a rate of 120 fields of data per second and 22 hours of operation per day, TerramEarth collects a
total of about 9 TB/day from these connected vehicles.
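The 9 TB/day figure can be sanity-checked with back-of-envelope arithmetic; the implied size per field below is derived from the stated numbers, not given in the case study:

```python
# Back-of-envelope check on the connected fleet's daily data volume.
vehicles = 200_000        # cellular-connected vehicles
fields_per_second = 120   # fields collected per vehicle per second
hours_per_day = 22        # stated daily operating hours

samples_per_day = vehicles * fields_per_second * hours_per_day * 3600
stated_bytes_per_day = 9 * 10**12          # the quoted ~9 TB/day
bytes_per_field = stated_bytes_per_day / samples_per_day

print(f"{samples_per_day:.3g} field samples/day")   # ~1.9e+12
print(f"~{bytes_per_field:.1f} bytes per field")    # ~4.7
```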
Existing Technical Environment

TerramEarth’s existing architecture is composed of Linux-based systems that reside in a data center. These
systems gzip CSV files from the field and upload via FTP, transform and aggregate them, and place the data in
their data warehouse. Because this process takes time, aggregated reports are based on data that is 3 weeks old.
With this data, TerramEarth has been able to preemptively stock replacement parts and reduce unplanned
downtime of their vehicles by 60%. However, because the data is stale, some customers are without their
vehicles for up to 4 weeks while they wait for replacement parts.
Business Requirements
• Decrease unplanned vehicle downtime to less than 1 week, without increasing the cost of carrying surplus
inventory
• Support the dealer network with more data on how their customers use their equipment to better position
new products and services.
• Have the ability to partner with different companies, especially with seed and fertilizer suppliers in the
fast-growing agricultural business, to create compelling joint offerings for their customers.
CEO Statement
We have been successful in capitalizing on the trend toward larger vehicles to increase the productivity of our
customers. Technological change is occurring rapidly and TerramEarth has taken advantage of connected
devices technology to provide our customers with better services, such as our intelligent farming equipment.
With this technology, we have been able to increase farmers' yields by 25%, by using past trends to adjust how
our vehicles operate. These advances have led to the rapid growth of our agricultural product line, which we
expect will generate 50% of our revenues by 2020.
CTO Statement
Our competitive advantage has always been in the manufacturing process, with our ability to build better
vehicles at lower cost than our competitors. However, new products with different approaches are constantly
being developed, and I'm concerned that we lack the skills to undergo the next wave of transformations in our
industry. Unfortunately, our CEO doesn't take technology obsolescence seriously and he considers the many
new companies in our industry to be niche players. My goals are to build our skills while addressing
immediate market needs through incremental innovations.

For this question, refer to the TerramEarth case study.
Your development team has created a structured API to retrieve vehicle data. They want to allow third parties
to develop tools for dealerships that use this vehicle event data. You want to support delegated authorization
against this data. What should you do?


A.

Build or leverage an OAuth-compatible access control system.


B.

Build SAML 2.0 SSO compatibility into your authentication system.


C.

Restrict data access based on the source IP address of the partner systems.


D.

Create secondary credentials for each dealer that can be given to the trusted third party.





A.
  

Build or leverage an OAuth-compatible access control system.



https://cloud.google.com/appengine/docs/flexible/go/authorizing-apps
https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations#delegate_application_authorization_
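OAuth 2.0 is the standard for delegated authorization: a dealer grants a third-party tool limited, revocable access to the vehicle API without ever sharing credentials. A minimal sketch of the resource-server side is below; the scope name `vehicles.read` and the stubbed token store are illustrative assumptions, and a real deployment would validate signed tokens or call an RFC 7662 introspection endpoint:

```python
# Sketch of an OAuth 2.0 resource-server check: the API accepts a bearer
# token issued by an authorization server and enforces the granted scope.
def introspect(token: str) -> dict:
    # Stand-in for RFC 7662 token introspection; returns token metadata.
    fake_token_db = {
        "tok-123": {"active": True, "scope": "vehicles.read", "sub": "dealer-42"},
        "tok-456": {"active": True, "scope": "profile", "sub": "dealer-99"},
    }
    return fake_token_db.get(token, {"active": False})

def authorize(token: str, required_scope: str) -> bool:
    info = introspect(token)
    if not info.get("active"):
        return False  # expired, revoked, or unknown token
    # OAuth scopes are a space-delimited list; require the one we need.
    return required_scope in info.get("scope", "").split()

print(authorize("tok-123", "vehicles.read"))  # True: dealer delegated read access
print(authorize("tok-456", "vehicles.read"))  # False: token lacks the scope
```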

For this question, refer to the TerramEarth case study.
To speed up data retrieval, more vehicles will be upgraded to cellular connections and be able to transmit data
to the ETL process. The current FTP process is error-prone and restarts the data transfer from the start of the
file when connections fail, which happens often. You want to improve the reliability of the solution and
minimize data transfer time on the cellular connections. What should you do?


A.

Use one Google Container Engine cluster of FTP servers. Save the data to a Multi-Regional bucket. Run
the ETL process using data in the bucket.


B.

Use multiple Google Container Engine clusters running FTP servers located in different regions. Save
the data to Multi-Regional buckets in us, eu, and asia. Run the ETL process using the data in the bucket.


C.

Directly transfer the files to different Google Cloud Multi-Regional Storage bucket locations in us, eu,
and asia using Google APIs over HTTP(S). Run the ETL process using the data in the bucket.


D.

Directly transfer the files to different Google Cloud Regional Storage bucket locations in us, eu, and asia
using Google APIs over HTTP(S). Run the ETL process to retrieve the data from each Regional bucket.





D.
  

Directly transfer the files to different Google Cloud Regional Storage bucket locations in us, eu, and asia
using Google APIs over HTTP(S). Run the ETL process to retrieve the data from each Regional bucket.
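The key reliability difference from FTP is that uploads over the Cloud Storage JSON API can be resumable: a failed chunk is retried and the transfer continues from the last confirmed byte instead of restarting the file. A sketch using the `google-cloud-storage` client is below; the bucket and object names are placeholders, and application default credentials are assumed:

```python
# Sketch: resumable upload of a gzipped CSV to a Regional bucket.
def upload_telemetry(bucket_name: str, source_path: str, dest_name: str) -> None:
    # Deferred import so the sketch can be loaded without the dependency
    # installed: pip install google-cloud-storage
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(dest_name)
    # Setting chunk_size forces a chunked, resumable upload; interrupted
    # transfers resume mid-file rather than from byte zero (unlike FTP).
    blob.chunk_size = 8 * 1024 * 1024  # 8 MiB
    blob.upload_from_filename(source_path)

# Illustrative usage:
# upload_telemetry("terramearth-us-telemetry",
#                  "vehicle-0042.csv.gz",
#                  "2024/10/22/vehicle-0042.csv.gz")
```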



For this question, refer to the TerramEarth case study.
TerramEarth has equipped unconnected trucks with servers and sensors to collect telemetry data. Next year they
want to use the data to train machine learning models. They want to store this data in the cloud while reducing
costs. What should they do?


A.

Have the vehicle's computer compress the data in hourly snapshots, and store it in a Google Cloud
Storage (GCS) Nearline bucket.


B.

Push the telemetry data in real-time to a streaming Dataflow job that compresses the data, and store it in
Google BigQuery.


C.

Push the telemetry data in real-time to a streaming Dataflow job that compresses the data, and store it in
Cloud Bigtable.


D.

Have the vehicle's computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket.





D.
  

Have the vehicle's computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket.
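Coldline is the lowest-cost storage class for data that is written once and read rarely, which fits telemetry that will not be touched until model training next year. A sketch of the compress-and-archive step is below; the bucket name and location are illustrative, and the upload assumes default GCP credentials:

```python
# Sketch: gzip an hourly telemetry snapshot, then archive it in a GCS
# Coldline bucket.
import gzip
import shutil

def compress_snapshot(src: str, dst: str) -> None:
    # gzip the raw hourly CSV to cut both storage and transfer cost.
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

def archive_to_coldline(bucket_name: str, local_path: str, dest_name: str) -> None:
    # Deferred import: pip install google-cloud-storage
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket(bucket_name)
    bucket.storage_class = "COLDLINE"  # lowest-cost class for rarely read data
    client.create_bucket(bucket, location="us-central1")
    bucket.blob(dest_name).upload_from_filename(local_path)

# Illustrative usage:
# compress_snapshot("vehicle-0042-hour13.csv", "vehicle-0042-hour13.csv.gz")
# archive_to_coldline("terramearth-telemetry-archive",
#                     "vehicle-0042-hour13.csv.gz",
#                     "2024/10/22/13/vehicle-0042.csv.gz")
```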



For this question, refer to the TerramEarth case study.
Which of TerramEarth's legacy enterprise processes will experience significant change as a result of increased
Google Cloud Platform adoption?


A.

Opex/capex allocation, LAN changes, capacity planning


B.

Capacity planning, TCO calculations, opex/capex allocation


C.

Capacity planning, utilization measurement, data center expansion


D.

Data Center expansion, TCO calculations, utilization measurement





B.
  

Capacity planning, TCO calculations, opex/capex allocation



For this question, refer to the TerramEarth case study.
TerramEarth plans to connect all 20 million vehicles in the field to the cloud. This increases the volume to 20
million 600-byte records per second, or about 40 TB per hour. How should you design the data ingestion?


A.

Vehicles write data directly to GCS.


B.

Vehicles write data directly to Google Cloud Pub/Sub.


C.

Vehicles stream data directly to Google BigQuery.


D.

Vehicles continue to write data using the existing system (FTP).





B.
  

Vehicles write data directly to Google Cloud Pub/Sub.



Scale to hundreds of millions of messages per second and pay only for the resources you use. There are no
partitions or local instances to manage, reducing operational overhead. Data is automatically and intelligently
distributed across data centers over our unique, high-speed private network.
https://cloud.google.com/pubsub/
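As a quick check on the stated rate: 20 million records/s × 600 bytes × 3,600 s ≈ 43 TB/hour, consistent with the quoted 40 TB figure. The sketch below computes that and shows a vehicle-side publish call; the project and topic names are illustrative, and `google-cloud-pubsub` plus default credentials are assumed:

```python
def expected_hourly_volume_tb(vehicles: int, record_bytes: int) -> float:
    # One record per vehicle per second -> bytes per hour -> TB per hour.
    return vehicles * record_bytes * 3600 / 10**12

def publish_record(project: str, topic: str, record: bytes) -> None:
    # Deferred import: pip install google-cloud-pubsub
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project, topic)
    future = publisher.publish(topic_path, record)  # publishes asynchronously
    future.result(timeout=30)                       # wait for the server's ack

print(expected_hourly_volume_tb(20_000_000, 600))   # ~43.2 TB/hour
```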

Your agricultural division is experimenting with fully autonomous vehicles.
You want your architecture to promote strong security during vehicle operation.
Which two architectures should you consider? (Choose 2 answers.)


A.

Treat every micro service call between modules on the vehicle as untrusted.


B.

Require IPv6 for connectivity to ensure a secure address space.


C.

Use a trusted platform module (TPM) and verify firmware and binaries on boot.


D.

Use a functional programming language to isolate code execution cycles.


E.

Use multiple connectivity subsystems for redundancy.


F.

Enclose the vehicle's drive electronics in a Faraday cage to isolate chips.





C.
  

Use a trusted platform module (TPM) and verify firmware and binaries on boot.



A.
  

Treat every micro service call between modules on the vehicle as untrusted.



For this question, refer to the TerramEarth case study.
The TerramEarth development team wants to create an API to meet the company's business requirements. You
want the development team to focus their development effort on business value versus creating a custom
framework. Which method should they use?


A.

Use Google App Engine with Google Cloud Endpoints. Focus on an API for dealers and partners.


B.

Use Google App Engine with a JAX-RS Jersey Java-based framework. Focus on an API for the public.


C.

Use Google App Engine with the Swagger (Open API Specification) framework. Focus on an API for
the public.


D.

Use Google Container Engine with a Django Python container. Focus on an API for the public.


E.

Use Google Container Engine with a Tomcat container with the Swagger (Open API Specification)
framework. Focus on an API for dealers and partners





A.
  

Use Google App Engine with Google Cloud Endpoints. Focus on an API for dealers and partners.



https://cloud.google.com/endpoints/docs/openapi/about-cloud-endpoints?hl=en_US&_ga=2.21787131.-1712523161.1522785064
https://cloud.google.com/endpoints/docs/openapi/architecture-overview
https://cloud.google.com/storage/docs/gsutil/commands/test
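Cloud Endpoints consumes an OpenAPI (Swagger 2.0) specification and layers authentication, monitoring, and API management on top, so the team writes a spec rather than building a framework. A minimal, illustrative fragment for the dealer-facing vehicle API is below; the host, path, and project ID are placeholders, not details from the case study:

```yaml
# openapi.yaml — illustrative Cloud Endpoints spec (OpenAPI 2.0)
swagger: "2.0"
info:
  title: TerramEarth Vehicle Events API
  version: "1.0.0"
host: vehicle-api.endpoints.YOUR_PROJECT_ID.cloud.goog
schemes:
  - https
paths:
  /v1/vehicles/{vehicleId}/events:
    get:
      operationId: listVehicleEvents
      parameters:
        - name: vehicleId
          in: path
          required: true
          type: string
      responses:
        "200":
          description: Recent telemetry events for the vehicle.
```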

For this question, refer to the JencoMart case study.
JencoMart has built a version of their application on Google Cloud Platform that serves traffic to Asia. You
want to measure success against their business and technical goals. Which metrics should you track?


A.

Error rates for requests from Asia


B.

Latency difference between US and Asia


C.

Total visits, error rates, and latency from Asia


D.

Total visits and average latency for users in Asia


E.

The number of character sets present in the database




C.
  

Total visits, error rates, and latency from Asia



For this question, refer to the JencoMart case study.
The migration of JencoMart’s application to Google Cloud Platform (GCP) is progressing too slowly. The
infrastructure is shown in the diagram. You want to maximize throughput. What are three potential
bottlenecks? (Choose 3 answers.)



A.

A single VPN tunnel, which limits throughput


B.

A tier of Google Cloud Storage that is not suited for this task


C.

A copy command that is not suited to operate over long distances


D.

Fewer virtual machines (VMs) in GCP than on-premises machines


E.

A separate storage layer outside the VMs, which is not suited for this task


F.

Complicated internet connectivity between the on-premises infrastructure and GCP





A.
  

A single VPN tunnel, which limits throughput



E.
  

A separate storage layer outside the VMs, which is not suited for this task



F.
  

Complicated internet connectivity between the on-premises infrastructure and GCP



For this question, refer to the JencoMart case study.
JencoMart has decided to migrate user profile storage to Google Cloud Datastore and the application servers to
Google Compute Engine (GCE). During the migration, the existing infrastructure will need access to Datastore
to upload the data. What service account key-management strategy should you recommend?


A.

Provision service account keys for the on-premises infrastructure and for the GCE virtual machines
(VMs).


B.

Authenticate the on-premises infrastructure with a user account and provision service account keys for
the VMs.


C.

Provision service account keys for the on-premises infrastructure and use Google Cloud Platform (GCP)
managed keys for the VMs


D.

Deploy a custom authentication service on GCE/Google Container Engine (GKE) for the on-premises
infrastructure and use GCP managed keys for the VMs





C.
  

Provision service account keys for the on-premises infrastructure and use Google Cloud Platform (GCP)
managed keys for the VMs



https://cloud.google.com/iam/docs/understanding-service-accounts
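The practical difference behind this answer: GCE VMs obtain short-lived, Google-managed service account credentials from the metadata server, so no key file ever needs to exist on the VM, while on-premises machines must authenticate with an exported key. A sketch of how application default credentials pick up either source is below; the key-file path is a placeholder:

```python
def datastore_client():
    # Deferred import: pip install google-cloud-datastore
    from google.cloud import datastore

    # On-premises: export a key for the migration service account and point
    # GOOGLE_APPLICATION_CREDENTIALS at it before starting the process, e.g.
    #   export GOOGLE_APPLICATION_CREDENTIALS=/secure/migration-sa.json
    #
    # On a GCE VM: set nothing. The client library fetches Google-managed,
    # short-lived credentials from the metadata server automatically, so no
    # key file needs to be created, rotated, or protected on the VM.
    return datastore.Client()
```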

