Visit the official SkillCertPro website:
For the full set of 800 questions, go to
https://skillcertpro.com/product/google-cloud-certified-associate-cloud-engineer-practice-exam-set/
SkillCertPro offers a detailed explanation for each question, which helps you understand the concepts better.
It is recommended that you score above 85% on SkillCertPro exams before attempting the real exam.
SkillCertPro updates exam questions every 2 weeks.
You get lifetime access and lifetime free updates.
SkillCertPro offers a 100% first-attempt pass guarantee.
Question 1:
You are building a banking-related application on Google Kubernetes Engine. Your security team has given the following requirements for the cluster:
The cluster should have verifiable node identity and integrity
The nodes should not be accessible from the internet.
What should you do to honor these requirements while keeping operational costs to a minimum?
A. Deploy a private autopilot cluster
B. Deploy a public autopilot cluster
C. Deploy a standard public cluster and enable shielded nodes
D. Deploy a standard private cluster and enable shielded nodes
Answer: A
Explanation:
A is correct because a private Autopilot cluster meets all the requirements. In a private cluster, nodes have only internal IP addresses and are not reachable from the internet. Autopilot clusters enable Shielded GKE Nodes by default, providing verifiable node identity and integrity, and Google manages the nodes for you, which keeps operational costs to a minimum.
B is incorrect because in a public Autopilot cluster the nodes have external IP addresses and are therefore accessible from the internet.
C is incorrect because a standard public cluster also exposes nodes to the internet. Shielded nodes provide verifiable node identity and integrity but do not restrict internet access.
D is incorrect because, although a standard private cluster with shielded nodes meets both security requirements, you must still manage the node pools yourself, so it does not keep operational costs as low as Autopilot does.
Links:
https://cloud.google.com/kubernetes-engine
https://cloud.google.com/kubernetes-engine/docs/how-to/shielded-gke-nodes
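For reference, a private cluster can be created in either mode with the gcloud CLI. A sketch, with placeholder project, cluster, and location names:

```shell
# Private Autopilot cluster: Shielded GKE Nodes are enabled by default,
# and --enable-private-nodes gives nodes internal IP addresses only.
gcloud container clusters create-auto banking-cluster \
    --region=us-central1 \
    --enable-private-nodes \
    --project=my-project   # placeholder project ID

# Standard-mode equivalent: shielded nodes must be enabled explicitly,
# and a private cluster needs VPC-native networking and a control-plane range.
gcloud container clusters create banking-cluster \
    --zone=us-central1-a \
    --enable-ip-alias \
    --enable-private-nodes \
    --master-ipv4-cidr=172.16.0.0/28 \
    --enable-shielded-nodes \
    --project=my-project   # placeholder project ID
```

These commands require an authenticated gcloud session and an existing project, so they are shown here only as a sketch.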
Question 2:
You are using Cloud Shell to build and deploy your crypto-wallet webapp. The webapp requires a custom utility at build time, and you need to make sure the utility is in the default execution path and persists across sessions. Where should you store it?
A. ~/bin
B. Cloud Storage
C. /cloud-shell/scripts
D. /usr/local/bin
Answer: A
Explanation:
A is correct because Cloud Shell provisions 5 GB of free persistent disk storage mounted as your $HOME directory on the virtual machine instance. This storage is on a per-user basis and is available across projects. Unlike the instance itself, this storage does not time out on inactivity. All files you store in your home directory, including installed software, scripts, and user configuration files like .bashrc and .vimrc, persist between sessions. Your $HOME directory is private to you and cannot be accessed by other users. The ~ sign signifies the home directory.
B is incorrect because you can store persistent data in Cloud Shell's $HOME directory. There is no need to store the utility in Cloud Storage.
C is incorrect because the /cloud-shell/scripts directory is not a valid directory.
D is incorrect because the Cloud Shell VM is ephemeral: anything installed outside your $HOME directory, including /usr/local/bin, is lost when the session ends.
Links:
https://cloud.google.com/shell/docs/how-cloud-shell-works
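As a sketch, installing a hypothetical utility named `greet` into `~/bin` looks like this (on Cloud Shell's Debian image, `~/.profile` adds `~/bin` to `PATH` automatically when the directory exists):

```shell
# Create the per-user bin directory on the persistent $HOME disk.
mkdir -p "$HOME/bin"

# Install a placeholder utility; a real build tool would be copied here instead.
cat > "$HOME/bin/greet" <<'EOF'
#!/bin/sh
echo "hello from ~/bin"
EOF
chmod +x "$HOME/bin/greet"

# Make it available in the current session; new sessions pick it up
# via ~/.profile, and the file itself survives session restarts.
export PATH="$HOME/bin:$PATH"
greet
```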
Question 3:
Multiple teams in your company use Google Cloud for their applications, and each team uses a different billing account. Your leadership team has requested a single dashboard where they can visualize the charges and make decisions. The dashboard should reflect new cost data as soon as possible. What can help you achieve this?
A. Populate all fields within the Pricing Calculator and generate an approximation of the monthly expenses using it
B. Access the Reports section within the Cloud Billing Console to review the specific cost details you're interested in
C. Navigate to the Cost Table page to obtain a CSV export, then employ Looker Studio to create visual representations
D. Set up Billing Data Export to BigQuery and utilize Looker Studio to visualize the data accordingly
Answer: D
Explanation:
A is incorrect because filling all resources in the Pricing Calculator will only give an estimate of the monthly cost and does not provide a single visual representation of all costs incurred.
B is incorrect because the Reports view in the Cloud Billing Console may provide cost information for individual projects, but it does not provide a single visual representation of all costs incurred across multiple projects.
C is incorrect because exporting the Cost Table page as a CSV produces a one-time, static snapshot. Visualizing that CSV in Looker Studio is possible, but the export would have to be repeated manually to stay current, so the dashboard cannot reflect new cost data as soon as possible.
D is correct because configuring Billing Data Export to BigQuery allows for the automatic export of cost data from multiple projects to a single location. By visualizing the data in Looker Studio, you can have a real-time, dynamic representation of all costs incurred across multiple projects, giving a comprehensive overview for better estimating future charges.
Links:
https://cloud.google.com/billing/docs/how-to/export-data-bigquery
https://cloud.google.com/billing/docs/how-to/visualize-data
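Once Billing Data Export is configured, the exported table can be queried directly and used as a Looker Studio data source. A sketch, assuming a dataset named `billing` and the standard export table naming convention (the project ID and billing account ID in the table name are placeholders):

```shell
# Query monthly cost per project from the standard usage cost export table.
# Table name format: gcp_billing_export_v1_<billing account ID, dashes as underscores>
bq query --use_legacy_sql=false '
SELECT
  project.name AS project_name,
  invoice.month AS invoice_month,
  ROUND(SUM(cost), 2) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
GROUP BY project_name, invoice_month
ORDER BY invoice_month, total_cost DESC'
```

Pointing Looker Studio at the same table (or at a view like this one) gives leadership a dashboard that refreshes as new cost data is exported.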
Question 4:
You have received complaints from your users that they experience high latency at random intervals in your app, which is hosted on Compute Engine. To investigate, your team needs to be monitoring the app while the latency is high. What solution can you use on Google Cloud to notify your team if the latency stays elevated for 5 minutes?
A. Transmit Cloud Monitoring metrics into BigQuery and utilize a Looker Studio dashboard to track the latency of your web application
B. Develop an alert policy to trigger notifications when the HTTP response latency surpasses the predetermined threshold
C. Set up an App Engine service that interacts with the Cloud Monitoring API and sends notifications in instances of anomalies
D. Leverage the Cloud Monitoring dashboard to monitor latency and initiate appropriate measures upon detection of response latency surpassing the designated threshold
Answer: B
Explanation:
A is incorrect because exporting Cloud Monitoring metrics to BigQuery and using a Looker Studio dashboard would allow for monitoring of web application latency, but it does not provide an automated notification for the support team when high latency is detected. This solution would require additional development and configuration to implement the desired automated notification.
B is correct because creating an alert policy to send a notification when the HTTP response latency exceeds the specified threshold is a Google-recommended solution with no development cost. This solution directly addresses the requirement of automatically notifying the support team when high latency is experienced by users for at least 5 minutes.
C is incorrect because implementing an App Engine service to invoke the Cloud Monitoring API and send a notification in case of anomalies would also require additional development and configuration. This solution does not directly address the requirement of automatically notifying the support team when high latency is experienced by users.
D is incorrect because using the Cloud Monitoring dashboard to observe latency and taking the necessary actions when the response latency exceeds the specified threshold would require manual monitoring and intervention from the support team. This solution does not fulfill the requirement of automatically notifying the support team when high latency is experienced by users.
Links:
https://cloud.google.com/monitoring/alerts
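A minimal sketch of such an alert policy, created from a JSON file with the gcloud CLI. The metric, threshold, and notification channel ID are assumptions for illustration; the 300 s duration implements the 5-minute condition:

```shell
# policy.json: fire when load balancer backend latency exceeds 500 ms
# continuously for 5 minutes.
cat > policy.json <<'EOF'
{
  "displayName": "High HTTP latency",
  "combiner": "OR",
  "conditions": [{
    "displayName": "Backend latency above 500 ms for 5 minutes",
    "conditionThreshold": {
      "filter": "metric.type = \"loadbalancing.googleapis.com/https/total_latencies\"",
      "comparison": "COMPARISON_GT",
      "thresholdValue": 500,
      "duration": "300s",
      "aggregations": [{
        "alignmentPeriod": "60s",
        "perSeriesAligner": "ALIGN_PERCENTILE_95"
      }]
    }
  }],
  "notificationChannels": ["projects/my-project/notificationChannels/CHANNEL_ID"]
}
EOF

# Create the policy (the policies command group is in the alpha track).
gcloud alpha monitoring policies create --policy-from-file=policy.json
```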
Question 5:
Your team is modernizing a legacy application by leveraging Docker. What should you choose to deploy this application on Google Cloud such that the team does not need to manage infrastructure and the app can scale well if it gains popularity?
A. Create an instance template using the container image, then set up a Managed Instance Group that employs Autoscaling
B. Transfer Docker images to Artifact Registry, and proceed to deploy the application on Google Kubernetes Engine using the Standard mode
C. Store Docker images in Cloud Storage, and proceed to deploy the application on Google Kubernetes Engine using the Standard mode
D. Move Docker images to Artifact Registry, and carry out the deployment of the application on Cloud Run
Answer: D
Explanation:
A is incorrect because it suggests creating an instance template and deploying a Managed Instance Group with Autoscaling. This approach is more suitable for managing virtual machine instances, not containers.
B is incorrect because it suggests uploading Docker images to Artifact Registry and deploying the application on Google Kubernetes Engine using Standard mode. While this option involves using containers and Kubernetes for orchestration, it does not mention anything about automatic scaling based on popularity.
C is incorrect because it suggests uploading Docker images to Cloud Storage and deploying the application on Google Kubernetes Engine using Standard mode. Similar to option B, this option does not mention anything about automatic scaling based on popularity.
D is correct because it suggests uploading Docker images to Artifact Registry and deploying the application on Cloud Run. Cloud Run is a managed compute platform that automatically scales your containers based on incoming requests or events. This ensures that your application can scale automatically as it gains popularity, without the need to manage the underlying infrastructure.
Links:
https://cloud.google.com/artifact-registry
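A sketch of the push-and-deploy flow described above; the repository, project, and service names are placeholders:

```shell
# One-time: create a Docker repository in Artifact Registry.
gcloud artifacts repositories create my-repo \
    --repository-format=docker \
    --location=us-central1

# Build, tag, and push the legacy app's image.
docker build -t us-central1-docker.pkg.dev/my-project/my-repo/legacy-app:v1 .
docker push us-central1-docker.pkg.dev/my-project/my-repo/legacy-app:v1

# Deploy to Cloud Run; instances scale up with traffic and down to zero,
# with no infrastructure for the team to manage.
gcloud run deploy legacy-app \
    --image=us-central1-docker.pkg.dev/my-project/my-repo/legacy-app:v1 \
    --region=us-central1
```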
Question 6:
Your E-commerce website is made up of 30 microservices. Each microservice has its own dedicated database backend. How should you store the credentials securely?
A. Store the credentials in the source code
B. Store the credentials in an environment variable
C. Store the credentials in a secret management system
D. Store the credentials in a config file that has restricted access through ACLs
Answer: C
Explanation:
A is incorrect because storing credentials in source code and source control is discoverable, in plain text, by anyone with access to the source code. This also introduces the requirement to update code and do a deployment each time the credentials are rotated.
B is incorrect because consistently populating environment variables would require the credentials to be available, in plain text, when the session is started.
C is correct because a secret management system such as Secret Manager stores sensitive data like database passwords as versioned secrets, encrypts them at rest, controls access through IAM, records every access in audit logs, and supports rotation without code changes. Sensitive credentials should never live in code or plain-text files.
D is incorrect because instead of managing access to the config file and updating manually as keys are rotated, it would be better to leverage a key management system. Additionally, there is an increased risk if the config file contains the credentials in plain text.
Links:
https://cloud.google.com/secret-manager
https://cloud.google.com/secret-manager/docs
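A sketch of the Secret Manager workflow for one microservice's database credential; the secret name and value are placeholders:

```shell
# Create one secret per microservice database credential.
gcloud secrets create checkout-db-password --replication-policy=automatic

# Add the credential as a new secret version (piped via stdin so the
# value never appears in shell history or process arguments).
printf 's3cr3t-placeholder' | \
    gcloud secrets versions add checkout-db-password --data-file=-

# At runtime, the microservice's service account reads the latest version.
gcloud secrets versions access latest --secret=checkout-db-password
```

Rotating the credential is then just adding a new version; none of the 30 microservices needs a code change or redeployment.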
Question 7:
You are maintaining a Google Cloud project that is used for development purposes by your team. Your company has a dedicated DevOps team that manages all Compute Engine instances in the company. How can you give the DevOps group all administrative permissions for Compute Engine in your project without giving it access to any other resources in the project?
A. Provide the DevOps team with the roles/viewer basic role and assign them the predefined roles/compute.admin role.
B. Develop an IAM policy that bestows the complete range of compute.instanceAdmin.* permissions. Associate this policy with the DevOps group.
C. Generate a custom role at the project level, conferring all compute.instanceAdmin.* permissions upon it. Then, grant this custom role to the DevOps group.
D. Grant the DevOps group with the roles/editor basic role.
Answer: C
Explanation:
A is incorrect because it grants the DevOps group more permissions than necessary. The roles/viewer role provides read-only access, which is not required for the group. Additionally, roles/compute.admin gives the ability to create or update any Compute Engine resources, which goes against the requirement of not having permission to create or update other resources in the project.
B is incorrect because an IAM policy binds members to roles; raw permissions cannot be attached to a policy directly. To grant the compute.instanceAdmin.* permissions, they must first be bundled into a role.
C is correct because it creates a custom role that specifically grants all compute.instanceAdmin.* permissions. This allows the DevOps group to have full control over Compute Engine resources without giving them excessive permissions for other resources in the project. Granting the custom role to the DevOps group ensures they have the necessary privileges.
D is incorrect because the roles/editor role provides excessive permissions that are not required for the DevOps group. It gives them the ability to manage all resources in the project, including Compute Engine, which goes against the requirement of not having permission to create or update other resources.
Links:
https://cloud.google.com/iam/docs/creating-custom-roles
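A sketch of creating and granting such a custom role; the project ID, role ID, and group address are placeholders, and the permission list is abbreviated:

```shell
# role.yaml bundles the needed permissions (abbreviated here; include
# every compute.instanceAdmin.* permission the team requires).
cat > role.yaml <<'EOF'
title: Compute Instance Admin (custom)
stage: GA
includedPermissions:
- compute.instances.create
- compute.instances.delete
- compute.instances.start
- compute.instances.stop
EOF

# Custom roles are created at the project (or organization) level.
gcloud iam roles create customInstanceAdmin \
    --project=my-project --file=role.yaml

# Bind the custom role to the DevOps group.
gcloud projects add-iam-policy-binding my-project \
    --member=group:devops@example.com \
    --role=projects/my-project/roles/customInstanceAdmin
```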
Question 8:
Your company’s Data Science team is building a Dataflow job on Google Cloud to process large quantities of unstructured data in multiple file formats using the ETL process. What should you do to make the data accessible to the Dataflow job?
A. Transfer the data to BigQuery utilizing the bq command line utility
B. Store the data in Cloud Storage through the employment of the gcloud storage command
C. Load the data into Cloud SQL utilizing the import feature available in the Google Cloud console
D. Ingest the data into Cloud Spanner via the import capability accessible in the Google Cloud console
Answer: B
Explanation:
A is incorrect because the bq command line tool is used for uploading data directly into BigQuery, which may not be the most efficient option for handling large quantities of unstructured data in different file formats. Additionally, the question specifically asks for the data to be made accessible on Google Cloud so it can be processed by a Dataflow job, which suggests that uploading the data directly into BigQuery may not be the best solution for this scenario.
B is correct because uploading the data to Cloud Storage using the gcloud storage command allows for efficient handling of large quantities of unstructured data in different file formats. Cloud Storage is designed to store and manage objects, such as files, and provides high scalability, durability, and accessibility. By uploading the data to Cloud Storage, it can then be easily processed by a Dataflow job.
C is incorrect because uploading the data into Cloud SQL using the import function is not the optimal solution for handling large quantities of unstructured data. Cloud SQL is a fully-managed relational database service, designed for structured data, and may not be suitable for the file formats and unstructured nature of the data mentioned in the question.
D is incorrect because uploading the data into Cloud Spanner using the import function is also not the optimal solution for handling large quantities of unstructured data. Cloud Spanner is a globally distributed, horizontally scalable, and strongly consistent relational database service, which may not be the best fit for unstructured data.
Links:
https://cloud.google.com/sdk/gcloud/reference/storage
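A sketch of staging the files with the gcloud storage command; the bucket name and local path are placeholders:

```shell
# Create a bucket and recursively upload the raw files.
gcloud storage buckets create gs://my-etl-raw-data --location=us-central1
gcloud storage cp --recursive ./raw-data gs://my-etl-raw-data/

# The Dataflow job can then read the objects with a wildcard pattern,
# e.g. gs://my-etl-raw-data/**, regardless of file format.
```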
Question 9:
You have a CI/CD server running in your development environment, and now you need to implement the same for production. You have deployed a separate CI/CD server, but it fails to execute Google Cloud actions due to permission issues. How do you verify that the service account has the correct roles in the production project?
A. Visit the Google Cloud console Identity and Access Management (IAM) section and verify the roles assigned to the service account at the project or inherited from the folder or organization levels
B. Check the organization policies in Google Cloud Console
C. Run a query to determine which resources this service account can access in Google Cloud Console
D. Run a query of the audit logs to find 'permission denied' errors for this service account in Google Cloud Console
Answer: A
Explanation:
A is correct because checking the Identity and Access Management (IAM) roles assigned to the service account at the project level, or inherited from the folder or organization levels, is the recommended way to verify the service account's permissions in the specific project.
B is incorrect because checking organization policies is not directly related to validating the service account's roles and permissions in the specific project.
C is incorrect because running a query to determine which resources the service account can access might help with troubleshooting, but it doesn't directly address the permission issues in the given project.
D is incorrect because querying audit logs to find permission-denied errors could be part of the troubleshooting process, but it doesn't directly validate the roles assigned to the service account in the project.
Links:
https://cloud.google.com/iam/docs/overview
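The same check can be scripted. A sketch listing every role granted to the CI/CD service account in the production project (project ID and service account address are placeholders):

```shell
# Flatten the project's IAM policy to one row per member, filter to the
# CI/CD service account, and print the roles it holds.
gcloud projects get-iam-policy prod-project \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:cicd@prod-project.iam.gserviceaccount.com" \
    --format="table(bindings.role)"
```

Note that this shows only project-level bindings; roles inherited from the folder or organization must be checked at those levels (or in the console's IAM page, which can display inherited roles).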
Question 10:
Your E-commerce website is gaining popularity all over the globe. As the number of users accessing your site increases, you need to make sure that the website’s performance remains consistent. Your users are accessing the website from across the world and you need to keep management complexity to a minimum. Which Google Cloud product should you choose?
A. Cloud SQL database with cross-region replication
B. Cloud Spanner
C. BigQuery with a Redis on Memorystore instance in the front
D. Bigtable database partitioned by username
Answer: B
Explanation:
A is incorrect because Cloud SQL may provide replication and availability across regions, but it may not guarantee the same level of performance and consistency as Cloud Spanner.
B is correct because Cloud Spanner is a globally distributed relational database that provides strong consistency and high availability. It is designed to scale horizontally across multiple regions while maintaining transactional consistency, making it a suitable choice for ensuring optimal performance for global users without increasing management complexity.
C is incorrect because BigQuery with Redis on Memorystore may offer scalability and global consistency through Redis caching, but it is primarily designed for analytical queries rather than real-time transactional operations.
D is incorrect because Bigtable partitioned by username may provide scalability and performance for key-value storage, but it may not offer the same level of transactional consistency and global availability as Cloud Spanner.
Links:
https://cloud.google.com/spanner
https://cloud.google.com/sql
https://cloud.google.com/bigquery
https://cloud.google.com/bigtable
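A sketch of provisioning a multi-region Cloud Spanner instance for such a workload; the instance name, database name, and node count are placeholders:

```shell
# A multi-region configuration (here nam-eur-asia1) places replicas across
# North America, Europe, and Asia so users worldwide get low-latency reads.
gcloud spanner instances create ecommerce-db \
    --config=nam-eur-asia1 \
    --description="Global e-commerce database" \
    --nodes=3

gcloud spanner databases create orders --instance=ecommerce-db
```

Spanner handles replication, sharding, and failover itself, which is what keeps the management complexity to a minimum.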