Feature request: Store results of performance tests somehow and somewhere for the future usage (e.g. performance regression testing on CI) #24
Comments
This project is great, and this idea will make it even better! I've been reviewing the codebase, and from my understanding there isn't yet a way to create a full baseline from a CI environment. Is that planned as a future step, or am I misunderstanding the overall intention/workflow? Thanks!
Hi @juanferreras, yes, the CLI is definitely a WIP; even though we've cut a 1.0.0, I expect we'll build it out further. Given we're still tightly bound to Storybook, you're right that creating a baseline in a CI environment requires a headless browser and a driver to run and collect the metrics. Internally at @atlassian we do have an implementation of some of this functionality, but it's quite opinionated to our pipelines. This is something we're definitely exploring, although I can't give you any timelines. For the time being we'll look to improve the documentation of the CLI to make it simpler to get better use out of it.
We also need to update the documentation around the file/save functionality, which is very useful for regression testing.
Hey y'all. I noticed that this is still undocumented. I'm happy (if not eager) to contribute, but was curious what the state of this effort is currently. Whether y'all might want some help with the CLI itself and/or just the docs, I may be able to help. Really appreciate all the work done so far (on the CLI and otherwise), it's great!
@DarkPurple141 would love hints as to what you're doing once the storybook is open in a headless browser. Do you have it interacting with the storybook UI to run tests, save results, etc.? Are you exposing ways of running that stuff programmatically once the storybook is open? |
Hey @joshacheson, I'm going to work on the documentation for the storage/CLI this weekend. Thanks for the nudge! I'll provide some insight in the docs into how Atlassian does the CI setup (which is still pretty rough). Certainly open to thinking about better ways to run these tests with test runners, or perhaps even in a Node environment, as an indicative approach! Let me write it up first though.
@DarkPurple141 Is there anything I can help with here to "make it happen"? I know that it's been more than 2 years since the last activity here, but I'm eager to help 🙇
Hey 👋
What a great idea for an addon! It reminds me of how Stripe was doing performance testing of their components several years ago. They found that testing the performance of specific components has much more value than testing whole pages with tons of components.
I know that Storybook addons live only in the browser, so the idea I want to propose exceeds the browser's boundaries, but... it's at least worth proposing, and generally I'd like to start a discussion about it.
Idea
Store performance results "somehow" and "somewhere" to enable automatic regression testing that could be run on CI.
Results could be stored via an API? Maybe with tools like Graphite? They could even be saved like jest's snapshot tests: locally, as a file with data in a specific format.

Do you plan to work on something like that in the future, in storybook-addon-performance or as a separate project? What do you think?