
Key Terms

Let's establish some key terms and concepts to build a foundation for how Latency Lingo works. Come back to this page if you find yourself struggling with concepts elsewhere in our documentation.

Concepts

  • Latency Lingo CLI: The tool to publish web performance test metrics to the Latency Lingo platform, so we can provide you with performance test reports. Intended to be used by the software testers who run the performance tests.
  • latencylingo.com: The user interface to analyze your performance test reports and collaborate with your team.
  • Web performance test: A test that measures the performance of a web application under defined load conditions. Often referred to as a load test.
  • Load test runner: A tool that software testers use to run web performance tests. We're big fans of JMeter, K6, Gatling, and Locust.
  • Performance test scenario: A test plan that is used to run a performance test.
  • Performance test operations: The different types of requests included in a performance test scenario.
  • Performance test run: The execution of a test scenario.
  • Performance test report: A report that contains the results of a web performance test run. It displays metrics, observations, and other information about the run.
  • Performance test metric: A metric that is measured by a performance test. Metrics that we cover include request count, error count, response times (avg, min, max, p50, p75, p90, p95, p99), and virtual users. See the sketch after this list for how these response-time percentiles can be derived from raw samples.
  • Performance test observation: A comment regarding some aspect of the performance test run.
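
To make the response-time percentiles above concrete, here is a minimal Python sketch of how metrics like p50 and p95 could be aggregated from raw samples using the nearest-rank method. The function names and sample values are illustrative assumptions, not Latency Lingo's actual implementation.

```python
from math import ceil

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile, e.g. pct=95 returns the p95 value."""
    ordered = sorted(samples)
    rank = max(1, ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def summarize(response_times_ms: list[float]) -> dict[str, float]:
    """Aggregate raw response times into the metrics a report displays."""
    return {
        "avg": sum(response_times_ms) / len(response_times_ms),
        "min": min(response_times_ms),
        "max": max(response_times_ms),
        **{f"p{p}": percentile(response_times_ms, p) for p in (50, 75, 90, 95, 99)},
    }

if __name__ == "__main__":
    # Hypothetical samples (in milliseconds) from a single test operation.
    samples = [120.0, 95.0, 210.0, 87.0, 450.0, 130.0, 99.0, 310.0]
    print(summarize(samples))
```

In practice, your load test runner records one response time per request; the platform aggregates them per operation and per run so the report can show these values over time.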

Features

  • Test Runs: Details and visualizations for a performance test run. Analyze the metrics, capture observations, and declare action items. Share the test run report with your team.
  • Dashboard: Displays all performance test scenarios and runs for your team. This is where you can get an overview of your performance test results and dive into specific runs.