I think storing accessed user info in a database, and tracking the generation time for those rows so we only perform API requests once or twice a day per user, could greatly improve our performance and prevent API rate-limit issues. @Strikingwolf, what do you think?
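The freshness check described above could look something like this minimal sketch. The `CACHE_TTL` value and the `cache_fresh?` helper are assumptions for illustration; "once or twice a day" is modeled here as a 12-hour window, and `generated_at` stands in for the generation-time column on a user's row.

```ruby
require "time"

# How long a cached row stays fresh before we re-query the GitHub API.
# "Twice a day" modeled as a 12-hour window (an assumption; tune as needed).
CACHE_TTL = 12 * 60 * 60

# Returns true when the row's generation time is recent enough to skip
# the API call. `generated_at` would come from the user's database row;
# nil means the user has never been fetched.
def cache_fresh?(generated_at, now = Time.now)
  !generated_at.nil? && (now - generated_at) < CACHE_TTL
end

stale = Time.now - (13 * 60 * 60) # generated 13 hours ago
fresh = Time.now - (60 * 60)      # generated 1 hour ago

cache_fresh?(stale) # => false: older than the TTL, so hit the API
cache_fresh?(fresh) # => true: serve from the database
```

On a cache miss, the handler would re-query the API, overwrite the row, and reset its generation time.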
User table (`Username` is the PRIMARY KEY):

| Username | Avatar | Current Streak | Longest Streak | Total Year | Average Day | Average Week | Average Month | Followers | Following | Public Repos | Private Repos | Forks | Mirrors | Repos |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| String | String | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Pipe-separated Strings |
Repository table (`User/Repo` is the PRIMARY KEY):

| User/Repo | Open Issues | Closed Issues | Open Pulls | Merged Pulls | Closed Pulls | Forks | Stargazers | Watchers | Language Data |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| String | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Integer | Hash as String |
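Until the real schema lands, the two row shapes above could be mirrored with plain-Ruby structs for passing cached data around. The field names here are assumptions mapped from the column lists; nothing in the thread fixes them.

```ruby
# Plain-Ruby mirrors of the proposed rows. Field names are assumptions
# derived from the column lists above.
User = Struct.new(
  :username, :avatar, :current_streak, :longest_streak, :total_year,
  :average_day, :average_week, :average_month, :followers, :following,
  :public_repos, :private_repos, :forks, :mirrors, :repos,
  keyword_init: true
)

Repository = Struct.new(
  :user_repo, :open_issues, :closed_issues, :open_pulls, :merged_pulls,
  :closed_pulls, :forks, :stargazers, :watchers, :language_data,
  keyword_init: true
)

u = User.new(username: "Strikingwolf", followers: 42,
             repos: "repo-a|repo-b") # pipe-separated, per the column spec
u.username  # => "Strikingwolf"
u.repos.split("|") # => ["repo-a", "repo-b"]
```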
I was reading about some database development in pure Sinatra today and was thinking of going with that. It won't be done until after #4, though, so it's still a while away.
@Strikingwolf How do you suppose we'll save the actual language data? Right now in the main post I have it as a string that represents the main hash, but I hate that.
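One option for the Language Data column (an assumption, not something settled in the thread) is to serialize the hash as JSON rather than storing the raw `Hash#inspect` string, since JSON round-trips cleanly back into a hash when the row is read out:

```ruby
require "json"

# GitHub's languages endpoint reports bytes per language, e.g.:
languages = { "Ruby" => 54_231, "JavaScript" => 1_204 }

# Serialize with JSON rather than Hash#inspect so the stored string
# parses back into a hash when read from the Language Data column.
stored = JSON.generate(languages)

restored = JSON.parse(stored)
restored["Ruby"] # => 54231
```

This keeps the column a plain String in the table while avoiding hand-rolled parsing on the way out.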