Is it possible to use SetExtractedValue, or an equivalent/alternative method, in Test Studio Mobile?
We used that method in Test Studio Web to verify entered data in later steps. We've checked the documentation but couldn't find anything like it.
In our case we want to create a test for a webshop. The test adds an article to the shopping cart, and at every step of the order process we want to verify that the article still has the same price and quantity.
Martien van Steenbergen
In my test scenario, I log into various applications with different usernames and passwords. The test scripts are bound to an external data source (an XML file) that stores each application's credentials.
I don't want these details exposed in the test/test list results, but currently everything in the data source file is logged. Please provide a way to mask or exclude sensitive data source values from the results.
Please explore the possibility of integrating with third-party email services such as SendGrid. SendGrid uses a username that is not a valid email address, so the current implementation of the SMTP settings in the Scheduling configuration cannot be used to send an email.
Often in our web app, many logs, reports, or forms are downloaded through the Step Builder's "Handle 'Download' dialog" step.
We would like to compare the downloaded file (mostly .xls, .xlsx, and .pdf) against a baseline file, so that we can verify the downloaded file has the expected content.
This can certainly be achieved through a coded step, but it would be great to have a built-in function to compare and verify files, similar to the built-in image verification.
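As an interim workaround, a coded step can compare the downloaded file to the baseline byte-for-byte. Test Studio coded steps are written in C# or VB.NET; the sketch below just illustrates the idea in Python, with hypothetical file paths. Note that exact byte comparison can be too strict for formats such as .xlsx or .pdf, which may embed timestamps even when the visible content is identical, which is part of why a content-aware built-in verification would be valuable.

```python
import hashlib

def file_sha256(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def files_match(downloaded_path, baseline_path):
    """Byte-for-byte equality check via hashes (strict: any metadata
    difference, e.g. an embedded timestamp, causes a mismatch)."""
    return file_sha256(downloaded_path) == file_sha256(baseline_path)
```

In a real coded step the same approach would use `System.Security.Cryptography.SHA256` over the two file streams and assert the digests are equal.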
We have been facing a problem performing data-driven tests against Kendo drop-down lists in our web app, although many forum posts suggest the issue has already been fixed.
Only now did we notice that the Kendo UI translators currently available in Test Studio are for Kendo UI for Angular.
But our web app uses Kendo UI for jQuery.
We would like to request a Kendo UI for jQuery translator as well (the jQuery Kendo DropDownList is what we use most), so that we can perform data-driven selection on jQuery Kendo drop-down lists.
Currently, OAuth 2.0 support is limited to the following grant types: password, client_credentials, and refresh_token.
It would be useful to add support for the Authorization Code grant and the Implicit grant as well.
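For context, the Authorization Code grant adds one step compared to the currently supported grants: the client first obtains a short-lived code via a browser redirect, then exchanges it at the token endpoint. A minimal sketch of the two requests per RFC 6749 follows; the endpoint URLs and client values are placeholders, not anything from Test Studio:

```python
from urllib.parse import urlencode

def build_authorize_url(auth_endpoint, client_id, redirect_uri, scope, state):
    """Step 1: the URL the user's browser is sent to.
    response_type 'code' marks the Authorization Code grant
    (the Implicit grant would use 'token' here instead)."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,  # CSRF protection, echoed back on the redirect
    }
    return auth_endpoint + "?" + urlencode(params)

def build_token_request_body(code, client_id, client_secret, redirect_uri):
    """Step 2: form body POSTed to the token endpoint after the
    redirect delivers ?code=... back to the client."""
    return {
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,  # must match the value from step 1
        "client_id": client_id,
        "client_secret": client_secret,
    }
```

Supporting this in the API testing feature would mainly mean handling the browser redirect of step 1, since step 2 is an ordinary token POST like the grants already supported.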
Is it possible for the test list results in Excel format to be as detailed as the HTML format?
Is there a particular reason the results in Excel format are so simplified?
Management would prefer to filter results conveniently in Excel, but the Excel report is overly summarized.
It would be great if we could also view which steps failed, with details, in the Excel format.
I ran a few tests in a test list in Test Studio Mobile and noticed there is no report; it is just a test list showing passed or failed test cases. Is there a reporting feature similar to the one for Web?
I would also like to export the test list results as part of our documentation after running a set of test cases. Does that functionality exist in Test Studio Mobile? I did not find anything matching these requirements in the documentation.
How can I show a report of the tests that have run, and export it?
Please let me know.
Similar in concept to the .sln file created for Visual Studio projects:
it would be nice if there were a single file per test project that opens the entire project inside Telerik Test Studio, instead of having to open Test Studio first and then navigate to the project and click "Select Folder".
Please add the ability to expand the suggested-options section that appears along the bottom of the Edit Elements tab. It would be nice to see more than four options at once without scrolling through a space that is less than 1.5 inches tall; in the old view, nearly all of the options were visible without scrolling. See the attached screenshot if needed.
Thank you in advance for your consideration.
Telerik does not support the Test Studio assemblies on .NET Standard and does not have a test runner for .NET Core. Since all of our Azure DevOps and container agents run Linux, we will not be able to use the tool.
Please add this feature to your product.
It would be useful to have a way to specify global/environment variables or parameters that is separate and independent from the data-driven feature. These global parameters would be available to all tests through the same data-driven interface for binding values to test steps, but would not be mingled into the data source itself. That way you could create a data-driven test that runs iterations while keeping static environment data, e.g. the user IDs to log in with, separate from the test data.
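The requested lookup behavior can be sketched as follows; all names here (the globals map, the row keys) are hypothetical examples, not Test Studio API. Per-iteration rows stay in the data source, while a separate environment map is consulted for any binding not found in the current row:

```python
# Hypothetical sketch: environment globals live apart from the data source
# but resolve through the same binding lookup used for data-driven values.
GLOBALS = {
    "login_user": "svc_account",
    "base_url": "https://staging.example.com",
}

DATA_ROWS = [  # the iterated data source: one row per test iteration
    {"article": "A-100", "price": "9.99"},
    {"article": "B-200", "price": "24.50"},
]

def resolve_binding(name, row, env=GLOBALS):
    """Prefer the current data row; fall back to shared environment globals."""
    if name in row:
        return row[name]
    if name in env:
        return env[name]
    raise KeyError(f"Unbound parameter: {name}")
```

The benefit is that changing an environment (e.g. pointing at a different server or service account) touches one globals definition instead of every row of every data source.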
The email body of the report is not detailed like the attached HTML report. Please make it so, so that the leadership team can view the failed steps directly from the email without downloading the attachment.