Which two project situations favor a waterfall methodology? Choose 2 answers
A. An application with many systems and inter-dependencies between components.
B. An application with regulatory compliance requirements to be validated by outside agencies.
C. An application in post-production, with incremental changes made by a small team.
D. An in-house application with a fixed team size, but an open timeline and flexible requirements.
Explanation:
An application with many systems and inter-dependencies between components favors a waterfall methodology because it requires a high level of upfront planning and design to ensure the integration and compatibility of the components. An application with regulatory compliance requirements to be validated by outside agencies also favors waterfall because it requires clear, detailed documentation of requirements and specifications, as well as a formal, rigorous testing and validation process. An application in post-production, with incremental changes made by a small team, instead favors an agile methodology, which allows faster and more frequent delivery of changes and greater flexibility and collaboration. An in-house application with a fixed team size but an open timeline and flexible requirements also favors agile, which allows more experimentation, feedback, and adaptation.
What are three advantages of using Salesforce DX (SFDX)? Choose 3 answers
A. Can store code on a local machine, or a version control system.
B. Can quickly deploy metadata using Execute Anonymous.
C. Can create scratch orgs.
D. Can use native Deployment Rollback Tool to quickly revert to prior state.
E. Can install application metadata from a central repository.
Explanation:
Three advantages of using Salesforce DX are: it can store code on a local machine or in a version control system; it can create scratch orgs; and it can install application metadata from a central repository. These capabilities improve the development experience by enabling source-driven development, ephemeral and easily configurable environments, and access to shared metadata components from a common repository. Quickly deploying metadata using Execute Anonymous is not an advantage of SFDX, because Execute Anonymous is a Developer Console feature for running Apex code, not for deploying metadata. A native Deployment Rollback Tool is also not an advantage, because no such tool exists in SFDX. See [Salesforce DX] for more details.
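As a rough sketch of the scratch org workflow behind these advantages (the org alias below is hypothetical, and the commands use the older sfdx-style syntax, which newer releases of the Salesforce CLI have renamed):

```bash
# Create a scratch org from the project's scratch org definition file
# and make it the default org for this project (alias: dev-scratch).
sfdx force:org:create -f config/project-scratch-def.json -a dev-scratch -s

# Push local source (kept on disk and in version control) to the scratch org.
sfdx force:source:push

# Pull declarative changes made in the scratch org back into local source.
sfdx force:source:pull

# Delete the scratch org once the work is merged.
sfdx force:org:delete -u dev-scratch -p
```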
Sales and Service products will be created by two teams that will use second-generation managed package(s). The Sales team will use a specific function of the Service product, but the architect wants to ensure that this team will only use the functions exposed by the Service team. No other team will use these same functions. What should an architect recommend?
A. Create two second generation managed packages with the same namespace and set the methods that should be shared with the @namespaceAccessible annotation.
B. Create two managed packages with Sales and Service namespaces. Set the methods to be shared with the @salesAccessible annotation.
C. Create a managed package with both products and create a code review process with an approver from each team.
D. Create two managed packages. Create an authentication function in the Service package that will return a token if a Sales user is authorized to call the exposed function. Validate the token in the Service functions.
Explanation:
The architect should recommend creating two second generation managed packages with the same namespace and setting the methods that should be shared with the @namespaceAccessible annotation. This will allow the Sales team to access the specific functions of the Service product without exposing them to other teams or customers. Creating two managed packages with different namespaces will not allow the Sales team to access the Service functions, unless they are declared as global, which will expose them to everyone. Creating a managed package with both products will not allow the separation of the products and the control of the functions. Creating an authentication function in the Service package will add unnecessary complexity and overhead to the solution.
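As a minimal sketch (the class and method names are hypothetical, not taken from the question), the Service package could expose a function to other packages in the same namespace like this:

```apex
// In the Service second-generation managed package.
// @NamespaceAccessible makes this public Apex callable from other
// packages that share the same namespace (e.g., the Sales package),
// without making it global to subscribers or other namespaces.
@NamespaceAccessible
public class ServiceEntitlementService {

    @NamespaceAccessible
    public static Boolean isEntitled(Id accountId) {
        // Hypothetical eligibility check, for illustration only.
        return [SELECT COUNT() FROM Case WHERE AccountId = :accountId] < 10;
    }
}
```

Because the shared method is public rather than global, code in subscriber orgs or in packages under other namespaces cannot call it.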
Metadata API supports deploy() and retrieve() calls for file-based deployment. Which two scenarios are the primary use cases for writing code to call retrieve() and deploy() methods directly? Choose 2 answers
A. Team development of an application in a Developer Edition organization. After completing development and testing, the application is distributed via Lightning Platform AppExchange.
B. Development of a custom application in a scratch org. After completing development and testing, the application is then deployed into an upper sandbox using the Salesforce CLI (SFDX).
C. Development of a customization in a sandbox organization. The deployment team then utilizes the Ant Migration Tool to deploy the customization to an upper sandbox for testing.
D. Development of a custom application in a sandbox organization. After completing development and testing, the application is then deployed into a production organization using Metadata API.
Explanation:
The primary use cases for writing code that calls retrieve() and deploy() directly are file-based deployments where a team builds its own tooling on top of the Metadata API: for example, packaging an application developed by a team in a Developer Edition org for distribution via Lightning Platform AppExchange, or deploying a custom application from a sandbox into a production org through the Metadata API. The Ant Migration Tool and the Salesforce CLI (SFDX) already wrap the Metadata API, so the scenarios that use those tools do not require writing code to call retrieve() and deploy() directly.
Ursa Major Solar (UMS) has used Aura components significantly in its Salesforce application development. UMS has established a robust test framework, and the development team follows the Salesforce-recommended testing practices. The UMS team uses Salesforce's test tool to check for common accessibility issues. In which two environments can the UMS team call Aura accessibility tests? Choose 2 answers
A. JSTEST
B. ACCTEST
C. WebDriver Test
D. AuraDriver Test
Explanation:
Aura accessibility tests can be called in JSTEST and WebDriver test environments. JSTEST refers to the JavaScript-based unit tests for Aura components, from which the accessibility assertions can be invoked directly. WebDriver Test refers to Selenium WebDriver-based UI tests, which exercise the rendered components in a browser and can run the same accessibility checks. ACCTEST and AuraDriver Test are not valid environments for calling Aura accessibility tests.
As a part of technical debt cleanup project, a large list of metadata components has been identified by the business analysts at Universal Containers for removal from the Salesforce org. How should an Architect manage these deletions across sandbox environments and production with minimal impact on other work streams?
A. Generate a destructiveChanges.xml file and deploy the package via the Force.com Migration Tool
B. Perform deletes manually in a sandbox and then deploy a Change Set to production
C. Assign business analysts to perform the deletes and split up the work between them
D. Delete the components in production and then refresh all sandboxes to receive the changes
Explanation:
A is the correct answer: generating a destructiveChanges.xml file and deploying the package via the Force.com Migration Tool is the best way to manage deletions of metadata components across sandbox environments and production with minimal impact on other work streams. A destructiveChanges.xml file is a package manifest that specifies the components to delete from an org, and it can be deployed with the Force.com (Ant) Migration Tool, a command-line tool that uses the Metadata API to retrieve and deploy metadata. This approach automates and streamlines the deletion process and keeps it consistent and repeatable across environments. B is incorrect: performing deletes manually in a sandbox and then deploying a change set to production is error-prone and inconsistent (change sets also cannot delete components), and it requires extra steps and permissions. C is incorrect: splitting manual deletions among business analysts invites confusion and a lack of coordination. D is incorrect: deleting the components in production first and then refreshing all sandboxes disrupts the production environment and any in-flight development and testing in the sandboxes. You can learn more about this topic in the Deploy Changes with the Force.com Migration Tool unit on Trailhead.
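As an illustration (the member names below are hypothetical), a destructiveChanges.xml is simply a package manifest that lists the components to remove; with the Force.com Migration Tool it sits in the deploy directory next to a package.xml that contains only the API version:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- destructiveChanges.xml: components to delete from the target org -->
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>Account.Obsolete_Field__c</members>
        <name>CustomField</name>
    </types>
    <types>
        <members>ObsoleteHelperClass</members>
        <name>ApexClass</name>
    </types>
</Package>
```

Running the tool's deploy task (ideally as a check-only validation first) then applies the same deletions consistently to each sandbox and, finally, to production.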
Which are two key benefits of fully integrating an agile issue tracker with software testing and continuous integration tools? Choose 2 answers
A. Developers can see automated test statuses for the commits on a specific user story.
B. Developers can collaborate and communicate effectively on specific user stories.
C. Developers can observe their team velocity on the burn chart report in the agile tool.
D. Developers can use the committed code's build status directly on the user story record.
Explanation:
Integrating an agile issue tracker with software testing and continuous integration tools provides two key benefits:
Developers can see automated test statuses for the commits on a specific user story, which helps them identify and fix errors or failures quickly.
Developers can view the committed code's build status directly on the user story record, which helps them track the progress and quality of their work.
When replacing an old legacy system with Salesforce, which two strategies should the plan consider to mitigate the risks associated with migrating data from the legacy system to Salesforce? Choose 2 answers
A. Identify the data relevant to the new system, including dependencies, and develop a plan/scripts for verification of data integrity.
B. Migrate users in phases based on their functions, requiring parallel use of the legacy system and Salesforce for a certain period of time.
C. Use a full sandbox environment for all the systems involved, a full deployment plan with test data generation scripts, and full testing including integrations.
D. Use a full sandbox environment and perform test runs of data migration scripts/processes with real data from the legacy system.
Explanation:
Identifying the relevant data and verifying the data integrity can help ensure the quality and accuracy of the migrated data. Using a full sandbox and performing test runs with real data can help validate the migration process and identify any issues or risks.
A technical lead is performing all code reviews for a team and is finding many errors and improvement points. This is delaying the team's deliveries. Which two actions can effectively contribute to the quality and agility of the team? Choose 2 answers
A. Choose the most senior developer to help the technical lead in the code review.
B. Create development standards and train teams in those standards.
C. Skip the code review and focus on functional tests and UAT.
D. Use a static code analysis tool in the pipeline before manual code review.
Explanation:
The two actions that effectively contribute to the quality and agility of the team are: create development standards and train the teams in those standards, and use a static code analysis tool in the pipeline before the manual code review. Development standards help ensure consistency, readability, and maintainability of the code and reduce errors and bugs. A static code analysis tool automates part of the review and flags issues or violations of the standards before a human ever looks at the code, so the manual review can focus on design and business logic. Adding the most senior developer to the review or skipping the review altogether are not effective: the first adds another bottleneck without addressing the root cause, and the second trades short-term speed for more defects later.
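As one possible example (the tool choice is an assumption, not something named in the question), PMD ships an Apex quickstart ruleset that can run as a pipeline step ahead of any human review. The invocation below follows the PMD 7 command-line syntax; the binary name and flags differ slightly in earlier versions.

```bash
# Scan Apex source against PMD's built-in Apex quickstart rules.
# A non-zero exit code on violations can be used to fail the pipeline step.
pmd check \
  --dir force-app \
  --rulesets rulesets/apex/quickstart.xml \
  --format text
```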
There are many types of quality assurance techniques that can help minimize defects in software projects. Which two techniques should an architect recommend, for Universal Containers to incorporate into its overall CI/CD pipeline? Choose 2 answers
A. Business verification testing
B. Stress testing
C. Automated browser testing
D. Static code quality analysis
Explanation:
Automated browser testing and static code quality analysis are two quality assurance techniques that help minimize defects in software projects and that an architect should recommend for Universal Containers' CI/CD pipeline. Automated browser testing uses tools or frameworks to simulate user interactions with the web application across different browsers and devices and to verify the application's functionality on every build. Static code quality analysis uses tools to scan the code and detect violations of predefined coding rules and best practices, such as syntax errors, security issues, and code smells, before the code is merged. Business verification testing and stress testing are also quality assurance techniques, but they are less suited to an automated CI/CD pipeline, as they focus on validating business requirements and system capacity respectively.