You are deploying a microservices application to Google Kubernetes Engine (GKE). The application will receive daily updates. You expect to deploy a large number of distinct containers that will run on the Linux operating system (OS). You want to be alerted to any known OS vulnerabilities in the new containers. You want to follow Google-recommended best practices. What should you do?
A. Use the gcloud CLI to call Container Analysis to scan new container images. Review the vulnerability results before each deployment.
B. Enable Container Analysis, and upload new container images to Artifact Registry. Review the vulnerability results before each deployment.
C. Enable Container Analysis, and upload new container images to Artifact Registry. Review the critical vulnerability results before each deployment.
D. Use the Container Analysis REST API to call Container Analysis to scan new container images. Review the vulnerability results before each deployment.
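For context on the Artifact Registry options above, enabling automatic scanning is typically just a matter of turning on the Container Scanning API and pushing images; the project, region, repository, and image names below are placeholders, not values from the question.

```shell
# Enable the Container Scanning API (part of Container Analysis);
# images pushed to Artifact Registry afterwards are scanned automatically.
gcloud services enable containerscanning.googleapis.com

# Push a new image to an Artifact Registry repository
# (my-project, us-docker.pkg.dev, and my-repo are placeholder names).
docker tag my-app:latest us-docker.pkg.dev/my-project/my-repo/my-app:latest
docker push us-docker.pkg.dev/my-project/my-repo/my-app:latest

# Review scan results (vulnerability occurrences) for images in the repository.
gcloud artifacts docker images list us-docker.pkg.dev/my-project/my-repo \
    --show-occurrences
```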
You need to redesign the ingestion of audit events from your authentication service to allow it to handle a large increase in traffic. Currently, the audit service and the authentication system run in the same Compute Engine virtual machine.
You plan to use the following Google Cloud tools in the new architecture:
Multiple Compute Engine machines, each running an instance of the authentication service
Multiple Compute Engine machines, each running an instance of the audit service
Pub/Sub to send the events from the authentication services.
How should you set up the topics and subscriptions to ensure that the system can handle a large volume of messages and can scale efficiently?
A. Create one Pub/Sub topic. Create one pull subscription to allow the audit services to share the messages.
B. Create one Pub/Sub topic. Create one pull subscription per audit service instance to allow the services to share the messages.
C. Create one Pub/Sub topic. Create one push subscription with the endpoint pointing to a load balancer in front of the audit services.
D. Create one Pub/Sub topic per authentication service. Create one pull subscription per topic to be used by one audit service.
E. Create one Pub/Sub topic per authentication service. Create one push subscription per topic, with the endpoint pointing to one audit service.
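As a sketch of the single-topic pattern the options describe: one topic fan-ins events from every authentication instance, and a single pull subscription lets many audit instances share the message stream, since Pub/Sub distributes messages among pullers on the same subscription. Topic and subscription names are placeholders.

```shell
# One topic that all authentication service instances publish to.
gcloud pubsub topics create auth-audit-events

# One pull subscription; multiple audit service instances pulling from the
# same subscription automatically share (load-balance) the messages.
gcloud pubsub subscriptions create auth-audit-sub \
    --topic=auth-audit-events \
    --ack-deadline=60
```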
You are in the final stage of migrating an on-premises data center to Google Cloud. You are quickly approaching your deadline and discover that a web API is running on a server slated for decommissioning. You need to recommend a solution to modernize this API while migrating to Google Cloud. The modernized web API must meet the following requirements:
Autoscales during high-traffic periods at the end of each month
Written in Python 3.x
Developers must be able to rapidly deploy new versions in response to frequent code changes
You want to minimize cost, effort, and operational overhead of this migration. What should you do?
A. Modernize and deploy the code on App Engine flexible environment.
B. Modernize and deploy the code on App Engine standard environment.
C. Deploy the modernized application to an n1-standard-1 Compute Engine instance.
D. Ask the development team to re-write the application to run as a Docker container on Google Kubernetes Engine.
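For reference, deploying to the App Engine standard environment (option B) needs little more than an `app.yaml`; automatic scaling is the default, and repeated `gcloud app deploy` runs support rapid iteration. The runtime shown is one example of a supported Python 3 runtime.

```shell
# Minimal app.yaml for the App Engine standard environment (Python 3 runtime).
# Automatic scaling is the default, so no scaling configuration is required.
cat > app.yaml <<'EOF'
runtime: python312
EOF

# Deploy a new version of the service.
gcloud app deploy app.yaml
```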
You have a web application that publishes messages to Pub/Sub. You plan to build new versions of the application locally and need to quickly test Pub/Sub integration for each new build. How should you configure local testing?
A. Run the gcloud config set api_endpoint_overrides/pubsub https://pubsubemulator.googleapis.com/ command to change the Pub/Sub endpoint prior to starting the application.
B. In the Google Cloud console, navigate to the API Library and enable the Pub/Sub API. When developing locally, configure your application to call pubsub.googleapis.com.
C. Install Cloud Code on the integrated development environment (IDE). Navigate to Cloud APIs, and enable Pub/Sub against a valid Google project ID. When developing locally, configure your application to call pubsub.googleapis.com.
D. Install the Pub/Sub emulator using gcloud, and start the emulator with a valid Google project ID. When developing locally, configure your application to use the local emulator by exporting the PUBSUB_EMULATOR_HOST variable.
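The emulator workflow in option D can be sketched as follows; the project ID is a placeholder, and the Cloud client libraries honor the PUBSUB_EMULATOR_HOST variable automatically.

```shell
# Install and start the Pub/Sub emulator (my-test-project is a placeholder).
gcloud components install pubsub-emulator
gcloud beta emulators pubsub start --project=my-test-project &

# env-init prints the export statement for PUBSUB_EMULATOR_HOST;
# evaluating it points the client libraries at the local emulator
# instead of pubsub.googleapis.com.
$(gcloud beta emulators pubsub env-init)
echo "$PUBSUB_EMULATOR_HOST"
```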
You recently migrated an on-premises monolithic application to a microservices application on Google Kubernetes Engine (GKE). The application has dependencies on backend services on-premises, including a CRM system and a MySQL database that contains personally identifiable information (PII). The backend services must remain on-premises to meet regulatory requirements.
You established a Cloud VPN connection between your on-premises data center and Google Cloud. You notice that some requests from your microservices application on GKE to the backend services are failing due to latency issues caused by fluctuating bandwidth, which is causing the application to crash. How should you address the latency issues?
A. Use Memorystore to cache frequently accessed PII data from the on-premises MySQL database
B. Use Istio to create a service mesh that includes the microservices on GKE and the on-premises services
C. Increase the number of Cloud VPN tunnels for the connection between Google Cloud and the on-premises services
D. Decrease the network layer packet size by decreasing the Maximum Transmission Unit (MTU) value from its default value on Cloud VPN
You are developing a microservice-based application that will be deployed on a Google Kubernetes Engine cluster. The application needs to read and write to a Spanner database. You want to follow security best practices while minimizing code changes. How should you configure your application to retrieve Spanner credentials?
A. Configure the appropriate service accounts, and use Workload Identity to run the pods.
B. Store the application credentials as Kubernetes Secrets, and expose them as environment variables.
C. Configure the appropriate routing rules, and use a VPC-native cluster to directly connect to the database.
D. Store the application credentials using Cloud Key Management Service, and retrieve them whenever a database connection is made.
Your company has deployed a new API to a Compute Engine instance. During testing, the API is not behaving as expected. You want to monitor the application over 12 hours to diagnose the problem within the application code without redeploying the application.
Which tool should you use?
A. Cloud Trace
B. Cloud Monitoring
C. Cloud Debugger logpoints
D. Cloud Debugger snapshots
You are building an API that will be used by Android and iOS apps. The API must:
Support HTTPS
Minimize bandwidth cost
Integrate easily with mobile apps
Which API architecture should you use?
A. RESTful APIs
B. MQTT for APIs
C. gRPC-based APIs
D. SOAP-based APIs
Your company has a BigQuery data mart that provides analytics information to hundreds of employees. One user wants to run jobs without interrupting important workloads. This user isn't concerned about the time it takes to run these jobs. You want to fulfill this request while minimizing cost to the company and the effort required on your part. What should you do?
A. Ask the user to run the jobs as batch jobs.
B. Create a separate project for the user to run jobs.
C. Grant the user the BigQuery jobUser role in the existing project.
D. Allow the user to run jobs when important workloads are not running.
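To illustrate the batch-job approach from option A: BigQuery runs batch-priority queries when idle resources are available, so they do not compete with interactive workloads and incur no extra cost. The project, dataset, and table names below are placeholders.

```shell
# Run a query at batch priority; BigQuery starts it when idle slots are
# available instead of competing with interactive (default-priority) jobs.
bq query --batch --use_legacy_sql=false \
    'SELECT COUNT(*) FROM `my-project.my_dataset.my_table`'
```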
You are a developer at a large organization. You have an application written in Go running in a production Google Kubernetes Engine (GKE) cluster. You need to add a new feature that requires access to BigQuery. You want to grant BigQuery access to your GKE cluster following Google-recommended best practices. What should you do?
A. Create a Google service account with BigQuery access. Add the JSON key to Secret Manager, and use the Go client library to access the JSON key.
B. Create a Google service account with BigQuery access. Add the Google service account JSON key as a Kubernetes secret, and configure the application to use this secret.
C. Create a Google service account with BigQuery access. Add the Google service account JSON key to Secret Manager, and use an init container to access the secret for the application to use.
D. Create a Google service account and a Kubernetes service account. Configure Workload Identity on the GKE cluster, and reference the Kubernetes service account on the application Deployment.
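The Workload Identity setup in option D can be sketched as below, assuming Workload Identity is already enabled on the cluster; all account, project, and namespace names are placeholders.

```shell
# 1. Google service account (GSA) with BigQuery access.
gcloud iam service-accounts create bq-reader-gsa
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:bq-reader-gsa@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.jobUser"

# 2. Kubernetes service account (KSA), allowed to impersonate the GSA.
kubectl create serviceaccount bq-reader-ksa
gcloud iam service-accounts add-iam-policy-binding \
    bq-reader-gsa@my-project.iam.gserviceaccount.com \
    --member="serviceAccount:my-project.svc.id.goog[default/bq-reader-ksa]" \
    --role="roles/iam.workloadIdentityUser"

# 3. Annotate the KSA with the GSA, then reference the KSA in the
#    Deployment's pod spec (spec.template.spec.serviceAccountName).
kubectl annotate serviceaccount bq-reader-ksa \
    iam.gke.io/gcp-service-account=bq-reader-gsa@my-project.iam.gserviceaccount.com
```

With this in place, the Go client libraries pick up credentials automatically via Application Default Credentials, so no key files or code changes are needed.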