Please add the browser name and version to the Excel sheet generated by the schedule notification email.
During recording, images are captured of the active screens. When these images are viewed after the recording is made, their resolution is low enough that it is difficult to make out objects on the captured pages. The same is true when a failure occurs: the failure image is also low resolution, which makes it difficult to compare the original recording image with the failure image. I would like the tester to be able to choose the level of resolution captured during the recording, with full resolution at the point of failure.
I'd really like to be able to use relative paths in WPF testing, as we have different workspace definitions in our project, and to have the directory context start in the root directory of the test project.
Currently the Telerik UI does not allow the user to use AND/OR logic in an IF statement. If this were allowed from the UI, users would not have to write code whenever they need two or more conditions in an IF statement.
Please make the default browser in the Test List settings reflect the selected preferred browser. Currently the default execution browser for a test list is set to IE. Refer to the attached screenshot.
When someone wants to use extracted data to data-drive find expressions, they have to do the following:
1. Set up a local data source.
2. Extract the data from the input field.
3. Use the extracted variable in a data-driven find expression to locate the value and verify it.
The problem is that a data source is necessary for data-driven find expressions to populate. Without a local or bound data source, data-driven find expressions aren't available, which means extracted variables can't be used.
Request: Allow data-driven find expressions to be available without having to bind to a data source. Find expressions would then be able to bind to a variable whenever one is available.
All of our recorded objects were inside a Silverlight iframe. Our applications have since changed and no longer use the iframe, but the existing scripts do not work because they throw an error that the Silverlight content frame is missing.
We use Telerik Test Studio with Team Explorer 2013 and the TFS integration, and we have a problem with our check-ins to source control. Our project uses a gated check-in policy, which changes the TFS check-in process: all changes need to be verified on the server before check-in. This policy is only necessary for the developers' work; our testers may bypass it. That is not possible in Telerik, so the testers now use Visual Studio for their Telerik check-ins, but it should also be possible through the integration that Telerik uses. The relevant parameters for the tf.exe command are /noprompt and /bypass. We were wondering if you could change the TFS connection in Telerik to support this.
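For reference, the bypass the testers need can be expressed as a tf.exe command-line invocation using the two parameters mentioned above. This is a command fragment only (it requires a configured TFS workspace and the permission to override gated check-in validation); the path and comment below are placeholders, not values from our project:

```shell
# Check in pending changes without the gated-build verification dialog:
#   /bypass   - skip the gated check-in build (requires override permission)
#   /noprompt - suppress the interactive check-in dialog
tf.exe checkin /bypass /noprompt /comment:"Test asset update" C:\Workspace\Tests
```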
It would be useful to be able to schedule a test list to run between two times (for instance, when a system is only available during certain hours). For example, I would like to run test list A hourly between 9am and 6pm.
Sometimes it would be nice to call a custom function before or after each individual test step. Adding "on before test step" and "on after test step" hooks to the execution extension would be useful here. The main use would be in the "on after test step" hook: waiting for some global value or variable to be set, or for a condition to become true, before moving on to the next test step. Source ticket 993986
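To make the request concrete, the desired hooks might look like the sketch below. This is purely hypothetical: no such per-step interface exists in the execution extension API today, and every name here (interface, method, and parameter types) is invented to illustrate the shape of the feature being asked for:

```csharp
// Hypothetical sketch of the requested per-step hooks -- these members
// do not exist in the current execution extension API.
public interface IStepExecutionExtension
{
    // Called just before the runner executes an individual test step.
    void OnBeforeTestStep(ExecutionContext context, TestStep step);

    // Called just after a step completes. Main use case: block here until
    // some global variable is set or a condition becomes true, so the
    // runner does not move on to the next step prematurely.
    void OnAfterTestStep(ExecutionContext context, TestStep step);
}
```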
The HtmlOrderedList & HtmlUnorderedList classes have extended properties that would be useful in quick steps, specifically Items and AllItems. There is value in being able to add a quick verification for Items.Count. Currently there is no translator in the recorder for these classes. They can be used in code, but it's not obvious how to find an element and cast it to the correct type. It gets even more confusing if you use Add Element in the recorder, because the element is added as an "HtmlControl": you have to cast it first to a BaseElement and then to an HtmlOrderedList before you can take advantage of the extended properties.
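A minimal sketch of the code-only workaround, assuming the WebAii typed Find.ById&lt;T&gt; helper and the Items property described above; "fruitList" and the expected count are hypothetical values, not from a real project:

```csharp
// Sketch of a coded step verifying an ordered list's item count.
// Assumes the typed find helper returns the control already cast,
// avoiding the manual HtmlControl -> BaseElement -> HtmlOrderedList chain.
[CodedStep("Verify ordered list item count")]
public void VerifyOrderedListItemCount()
{
    // "fruitList" is a placeholder element id.
    HtmlOrderedList list = Find.ById<HtmlOrderedList>("fruitList");
    Assert.IsNotNull(list, "Ordered list 'fruitList' was not found.");

    // Items holds the list's <li> entries; AllItems would also include
    // items from nested lists. The expected count here is illustrative.
    Assert.AreEqual(3, list.Items.Count);
}
```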
I would like to be able to give each test list its own description, as is already possible for tests in the Test Properties.
On the Results tab for both report and calendar view, it would be helpful to know what the BaseURL was for each test. We often run the same test list against different applications by simply changing the BaseURL value for that list. As such it would be helpful to see that data at a glance on the calendar views and the report view on the Results tab.
It would be nice to have a "fix all" button when the "Used by" context menu displays broken test-as-step references. We have many tests that are used as steps in hundreds of other tests, so if a reference gets messed up, clicking OK hundreds of times takes rather a long time.
Currently if the name is changed on a data source, you have to manually update each test that is bound to that source. Since there is no easy way to see all the tests that are bound to a source, it would be nice for those changes to update automatically.
When exporting a load test result to Excel, it would be nice if there were individual worksheets in the workbook corresponding to the individual server counters that are configured in the load test.
It would be VERY nice to be able to somehow export the “Analyze your results” area of a load test so that the graphical counter data can be viewed outside of Test Studio (maybe by the lightweight Telerik.TestStudio.ResultsViewer.exe file). This would make it much easier to share high-level load test results with executives and other interested parties who are not users of Test Studio.
It would be nice to be able to import/export load test counter templates as used in the “Monitor Performance” section of a load test. This way we could easily update the templates used across multiple load tests and multiple Test Studio projects.
It would be VERY nice to be able to use a distinct data source per load test user profile rather than one query for the entire load test. We have a scenario where we would like to be able to test variations of load based on percentage of user types that are using the application. Each user type would be a separate user profile in the load test and we could adjust the workload accordingly. In order for the load test to use multiple logins and search terms, we would need each user profile to be data bound to a different query or data source. This is because each user type accesses different dashboards in the application which may affect the performance differently, and we’d like to be able to see how different user type workload percentage configurations affect the load on the servers.
It would be nice to be able to import multiple existing tests into a new project by folder/subfolder rather than one at a time. This way Test Studio could recreate a complex folder structure easily and import tests into the proper folder. Import should properly handle all "tests as step" scenarios so that the UniqueID values get updated accordingly.