Under Review
Last Updated: 29 Oct 2020 16:18 by Matt
Created by: Omid
Comments: 3
Type: Feature Request
16

Telerik does not support Test Studio assemblies in .NET Standard and does not have a test runner for .NET Core. Since all of our Azure DevOps and container agents run Linux, we will not be able to use the tool.

Please add this feature to your product.

Under Review
Last Updated: 28 Sep 2020 14:44 by Stephane
Created by: Allen
Comments: 8
Type: Feature Request
13
It would be nice to have the storyboard capture subtest screenshots instead of the "Test Step" placeholder. It would be fine to watermark the screenshots with "Test Step," but still show the screens for reference.
Under Review
Last Updated: 15 Sep 2020 13:50 by ADMIN

Not a big deal, or anything requiring further support, but an observation worth noting nonetheless:

If a dynamic target is set between two steps and one of those steps is removed from the HTTP traffic, the target destination defaults to zero; there is no way to edit the target to reflect a new destination step (nor does the target information simply pass to the step that takes the deleted step's place in the HTTP order), and any further attempt to edit that traffic or the dynamic target crashes the application.

Also of note: the reason this has been an issue is that deleted traffic seems to reappear on application close and reopen, even when the traffic is saved (with or without the crashing behavior).

Hope this helps! Thanks. 

Under Review
Last Updated: 18 Aug 2020 07:49 by ADMIN

Hello,

ResultsViewer crashes when closing the Step Failure Details window with the OK button.

To reproduce:

  1. Open Trends_UserDefined 132421213805726776.aiiresult
  2. Double-click on Trends_UserDefined 17/08/2020 09:00
  3. Double-click on Trends_UserDefined.tstest
  4. Double-click on any failed step
  5. Click the OK button

 

Best regards,
Alexandre.

Under Review
Last Updated: 06 Jul 2020 15:03 by ADMIN

Telerik.ApiTesting.Runner.exe cannot output results in the junitstep format when executing API tests. It throws an error when the -f junitstep option is passed while running a test or a test suite:

[ERROR] Not supported test results format
Parameter name: junitstep
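
For reference, the invocation has roughly the following shape (the paths and names are placeholders, and the test verb with the -p/-t switches is how we normally call the runner, so treat those parts as an assumption; only the -f junitstep option is what triggers the error):

  Telerik.ApiTesting.Runner.exe test -p "C:\Projects\MyApiProject" -t "Suite1" -f junitstep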
Under Review
Last Updated: 23 Jun 2020 09:52 by ADMIN

Dear Support,

The version I am using, 2020.1.403.0, is not listed below.

I am noticing an inconsistency when using the Replace Element feature: the attributes selected do not respect the priority set in the Settings - Find Logic (Html) screen.

Please refer to the attached screenshots.

Under Review
Last Updated: 12 May 2020 11:43 by ADMIN

See the screenshot.

There are two annotation markers. For some reason, a "MouseClick:LeftClick" annotation will appear, as it should, but then it sticks there, even as 10 or 12 other steps execute and their annotations come and go. Sometimes another MouseClick:LeftClick will appear, the old one will go away, and then the new one will stick. But this does not happen every time. Sometimes a new MouseClick:LeftClick annotation comes in and goes away, and the old one is still hanging around.

This happens with almost all of my scripts.

Is it because some of the mouse clicks are from recorded steps and some are from code steps? This is just a guess.
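
In case it helps narrow this down, here is roughly what our code steps do. This is a stripped-down sketch: the button and its id are placeholders, while ActiveBrowser, Find, and HtmlButton are the standard WebAii API:

  using ArtOfTest.WebAii.Controls.HtmlControls; // for HtmlButton

  // Inside a coded step: the two click styles we mix in our scripts.
  HtmlButton save = ActiveBrowser.Find.ById<HtmlButton>("btnSave"); // placeholder id

  // Real mouse click: moves the cursor and clicks, the way a recorded step does.
  save.MouseClick();

  // Pure DOM click: raises the click event without touching the mouse.
  save.Click();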

Under Review
Last Updated: 07 May 2020 10:39 by ADMIN

Hi guys,

We're using the Telerik Testing Framework for automation, running tests primarily against the IE browser. Since we have security policies enforced on our clients, tests have started to fail more often with an ApplicationException on start.

After some investigation we found that the LaunchNewBrowserInstance() function in Core.Manager has a timeout hardcoded to 5000 ms, which is no longer enough for IE to respond. As you can see in the code below, everything uses the 'timeout' variable except 'Connector.Attach(ref handle, 5000);', which waits with the hardcoded timeout:

internal static object LaunchNewBrowserInstance(
      int timeout,
      ProcessWindowStyle windowStyle,
      string pipename,
      string url)
    {
      string str = string.IsNullOrEmpty(url) ? "about:blank" : url;
      try
      {
        ProcessStartInfo startInfo = new ProcessStartInfo()
        {
          Arguments = (InternetExplorerActions.MajorVersion >= 8 ? "-nomerge " : string.Empty) + str,
          Verb = "open",
          WindowStyle = windowStyle,
          ErrorDialog = false,
          FileName = "iexplore.exe"
        };
        System.Diagnostics.Process process = System.Diagnostics.Process.Start(startInfo);
        TraceInfo.Framework.ReportProcessLaunched(process, startInfo);
        IntPtr handle = InternetExplorerActions.WaitForIEFrameFromProcess(process, timeout);
        TraceInfo.Framework.WriteLine("Attempting to attach on IE frame (HWND={0})...", (object) handle);
        Connector.Attach(ref handle, 5000); // hardcoded 5000 ms wait; ignores the 'timeout' parameter
        System.Diagnostics.Process currentProcess = System.Diagnostics.Process.GetCurrentProcess();
        int num = currentProcess.Id;
        if (currentProcess.ProcessName == "ArtOfTest.Runner")
          num = InternetExplorerActions.SafeGetParentOrCurrentId(currentProcess);
        Connector.InjectCode(handle, InternetExplorerActions.ArtOfTestPlugin, pipename, num.ToString(), true, timeout, "");
        return (object) null;
      }
      catch (Exception ex)
      {
        throw new ApplicationException("Exception thrown attempting to launch Internet Explorer. Please make sure Internet Explorer is properly installed and you are able to launch it.", ex);
      }
    }

This is likely why our increased timeout settings are ignored and tests fail shortly after start, even if we increase the timeouts significantly. Visually, it looks like IE is doing some background work and is unresponsive for a period of time, and 5 seconds is not enough to wait. So, could you change the code to respect 'timeout' instead of using a constant?
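
Concretely, the ask is for that one call to honor the method's parameter:

      Connector.Attach(ref handle, timeout);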

We've been using the Telerik Testing Framework for a long time and have had this issue from time to time, but we worked around it by catching the exception, killing the browser, and launching it again. Now that trick no longer helps; the majority of tests fail with this exception on start.
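
For reference, the old workaround looked roughly like this. A sketch only: the method name, retry count, and messages are ours, while Manager, BrowserType, Start, LaunchNewBrowser, and Dispose are the public WebAii API:

using System;
using System.Diagnostics;
using ArtOfTest.WebAii.Core;

static Manager LaunchIeWithRetry(int attempts)
{
    for (int i = 0; i < attempts; i++)
    {
        Manager manager = new Manager();
        manager.Start();
        try
        {
            // Throws ApplicationException when the hardcoded 5000 ms attach window expires.
            manager.LaunchNewBrowser(BrowserType.InternetExplorer);
            return manager;
        }
        catch (ApplicationException)
        {
            // Attach timed out: shut the manager down, kill any orphaned
            // IE processes, and try again.
            manager.Dispose();
            foreach (Process p in Process.GetProcessesByName("iexplore"))
            {
                try { p.Kill(); } catch (Exception) { /* already exited */ }
            }
        }
    }
    throw new ApplicationException("IE did not attach after " + attempts + " attempts.");
}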

This fix is really simple and will save us a lot of time that would otherwise go into inventing some new way to make IE more responsive on start.

 

Thanks,

Oleksii

Under Review
Last Updated: 22 Apr 2020 13:27 by ADMIN
Created by: Piyush
Comments: 2
Type: Feature Request
2
Add dark and light themes to Test Studio that can be switched at any time.
Under Review
Last Updated: 15 Apr 2020 10:33 by ADMIN

On a data-driven web test, setting the DataRange property to the example value "SingleRow" causes Visual Studio 2017 to crash. I have three rows of data in a data table that are bound to specific steps of my test, and I was attempting to temporarily limit execution to a single row. Using SingleRow or 'SingleRow' crashes Visual Studio 2017 with the following stack trace, as found in the Windows Event Log. I was able to use '1:1' or ':1' successfully as a workaround.

Application: devenv.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.ArgumentException
   at ArtOfTest.Common.Design.ProjectModel.TestBase.set_DataRange(System.String)
   at ArtOfTest.Common.Design.ProjectModel.TestBase.SetUserProperties(ArtOfTest.Common.Design.ProjectModel.TestBaseUserProperties)
   at ArtOfTest.WebAii.Design.ProjectModel.Test.SetUserProperties(ArtOfTest.Common.Design.ProjectModel.TestBaseUserProperties)
   at Telerik.TestStudio.Web.Toolbar.previewTestProperties_Click(System.Object, System.EventArgs)
   at System.Windows.Forms.ToolStripItem.RaiseEvent(System.Object, System.EventArgs)
   at System.Windows.Forms.ToolStripButton.OnClick(System.EventArgs)
   at System.Windows.Forms.ToolStripItem.HandleClick(System.EventArgs)
   at System.Windows.Forms.ToolStripItem.HandleMouseUp(System.Windows.Forms.MouseEventArgs)
   at System.Windows.Forms.ToolStripItem.FireEventInteractive(System.EventArgs, System.Windows.Forms.ToolStripItemEventType)
   at System.Windows.Forms.ToolStripItem.FireEvent(System.EventArgs, System.Windows.Forms.ToolStripItemEventType)
   at System.Windows.Forms.ToolStrip.OnMouseUp(System.Windows.Forms.MouseEventArgs)
   at System.Windows.Forms.Control.WmMouseUp(System.Windows.Forms.Message ByRef, System.Windows.Forms.MouseButtons, Int32)
   at System.Windows.Forms.Control.WndProc(System.Windows.Forms.Message ByRef)
   at System.Windows.Forms.ScrollableControl.WndProc(System.Windows.Forms.Message ByRef)
   at System.Windows.Forms.ToolStrip.WndProc(System.Windows.Forms.Message ByRef)
   at ArtOfTest.WebAii.Design.UI.VsToolStrip.WndProc(System.Windows.Forms.Message ByRef)
   at System.Windows.Forms.Control+ControlNativeWindow.OnMessage(System.Windows.Forms.Message ByRef)
   at System.Windows.Forms.Control+ControlNativeWindow.WndProc(System.Windows.Forms.Message ByRef)
   at System.Windows.Forms.NativeWindow.DebuggableCallback(IntPtr, Int32, IntPtr, IntPtr)
Under Review
Last Updated: 06 Apr 2020 08:50 by ADMIN
Created by: Larry
Comments: 0
Type: Bug Report
1
The "Description" column is missing when I export the test results from the Results tab to Excel. This was working in Test Studio 2019.1 and earlier versions. It looks to be a regression, so please take a look and resolve it.
Under Review
Last Updated: 23 Sep 2019 07:53 by ADMIN

Often in our web app, logs, reports, and forms are downloaded through the Step Builder's "Handle 'Download' dialog" step.

We would like to compare the downloaded file (xls, xlsx, and pdf are used the most) to a baseline file, so that we can verify the downloaded file has the same content as the expected result.

This can definitely be achieved through a "Coded Step," but it would be great to have a built-in function to compare and verify a file, similar to image verification.
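
For anyone needing this today, a coded step along these lines is one way to do it. A minimal sketch: both paths are placeholders, and note that a byte-for-byte hash match only works when the exported file is generated deterministically (pdf files, for instance, often embed timestamps):

using System;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

// Hypothetical coded step: fail the step (and the test) when the
// downloaded file differs byte-for-byte from the baseline.
string downloaded = @"C:\Temp\Downloads\report.xlsx";     // placeholder path
string baseline = @"C:\TestData\Baselines\report.xlsx";   // placeholder path

using (SHA256 sha = SHA256.Create())
{
    byte[] a = sha.ComputeHash(File.ReadAllBytes(downloaded));
    byte[] b = sha.ComputeHash(File.ReadAllBytes(baseline));

    // Throwing an exception fails the coded step.
    if (!a.SequenceEqual(b))
        throw new ApplicationException("Downloaded file does not match the baseline.");
}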

Under Review
Last Updated: 23 Sep 2019 07:29 by ADMIN

Hi,

Is it possible to have the Test List results in Excel format be as detailed as the HTML format?

Is there any particular reason the results in Excel format are so simplified?

Management would prefer to be able to filter results conveniently in Excel, yet the results reporting in Excel is overly summarized.

It would be great if we could also view which steps fail, with details, in the Excel format.

Under Review
Last Updated: 26 Jun 2019 08:29 by ADMIN

Hi,

We are using the Telerik Testing Framework for automation testing. Currently we execute our scripts in a client VM using TFS builds, and we would now like to move to Docker execution. We couldn't find a direct solution in any forum for integrating the Telerik Testing Framework with Docker. Please help us identify a Docker image for the Telerik Testing Framework, so that we can create a container from it and run our test scripts in a Docker VM. Any websites or links related to this topic would be helpful.