
Background

After taking a few months off from MGS for some work and life things that needed my attention, I finally got some time to log back on and post a long overdue review. After posting, I found myself curious about how I was selected and the likelihood of being selected again, especially since my product was so niche and did not receive many interactions. So I took it upon myself to do a not-so-scientific study of how reviewers are selected and compare it to the Mods' post on how testers are chosen (https://forum.mygolfspy.com/faq/testing/how-are-equipment-testers-chosen-r11/). While there are no bombshells, I think we Spies understand the value of data in telling our story. So, I did my best to pull all the data from the past 18 months of reviews.

Methods

First off, I am no data scientist. From DJ to Bryson, I'm somewhere in the middle. I mainly do very basic data analysis in my role through Excel. I know, I know, SQL is much more efficient and I'll get around to learning it. For this study, I focused on 18 months of product reviews. I chose this timeframe to get some data points outside of the pandemic. I started going through the review pages and noticed they were not as organized as I thought, so I limited myself to the 18 months of reviews already archived on the "Past Opportunities" page in the dropdown menu. Like I said, there may have been a few more reviews that had not yet been archived to that page, but I don't think the number is significant. I then did the manual labor of data entry and collection: I clicked on each individual "TESTERS ANNOUNCED" post within the time frame and recorded the following data points:

- Date the review was published
- Product
- Rough product MSRP
- MGS username
- Whether the user has a Donor badge on their profile
- Content count
- Total reputation
- Month and year the MGS account was created
- City and state

Once all of this information was input into the spreadsheet, I added a note if a member had multiple reviews, and I also created an additional metric to measure the quality of posts. All analysis comes from these data points.
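For anyone curious what one row of the spreadsheet looked like, here is a minimal sketch of the record layout in Python. The field names are my own shorthand, not the actual column headers I used, so treat it as illustrative only.

from dataclasses import dataclass

# One row per tester per review. Field names are my own shorthand,
# not the actual spreadsheet headers.
@dataclass
class TesterRecord:
    review_date: str       # date the review was published
    product: str
    msrp: float            # rough MSRP from a quick Google search
    username: str
    has_donor_badge: bool  # Donor badge visible on the profile
    content_count: int     # total posts (at collection time, not selection time)
    total_reputation: int  # total reputation (same caveat)
    joined: str            # month and year the MGS account was created
    city: str
    state: str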
Limitations

Overall, the main limitation is that this dataset was manually curated, so there could be mistakes, but any mistake would be minor. I have checked that there is not an extra 0 added anywhere, but reputation counts could be off by +/- 5 here and there due to fat fingers on a keyboard. Additionally, MSRP came from a quick Google search; I get that there could have been sales, or that the price was actually $25 higher at the time of testing. Finally, the content count and total reputation numbers are not accurate to the time the member was selected. There is no way for me to go back and see that information, but I proceeded with collecting it because, for the most part, I believe most people's post counts grow linearly. So the quality-of-posts metric should not be too far off. I will note that in bigger reviews (CobraConnect), this ratio can get skewed. Enough with the small talk, let's take a look at the data.

Data

By the numbers:

- 48 total products
- 202 unique testers
- ~$119,625 total cost of products tested

As I said in the introduction, I will be using the MGS selection criteria as the framework for analysis.

Picture – I did not go through and collect this data. This is also the easiest criterion of all of them: put something up there! I've seen pictures of the people themselves (I assume), cartoons, their children, memes, etc. Just put something up. (Inclusive, but probably true)

Prove review chops – 71 testers (35%) had more than one review. If you get chosen, even for the smallest thing, a pushcart distance tracker, a dozen balls, etc., take it seriously and write a good review. That means good pictures, concise statements that clearly articulate your opinion of the product, timely submissions, and good follow-ups. A good first review will go a long way toward getting your second, and the data clearly backs this up. One anecdotal note: it was very common to see a progression of items to review, starting with a smaller item and then growing to more expensive ones. (True)

Start posting – Please remember that this category is not perfect, because I was not able to pull the post and reputation numbers from the time the testers were selected; I am assuming that people post at a linear rate. I looked at three measures here: number of posts, total reputation, and a third metric that I created, the Quality of Post Score, which is the number of interactions per post a reviewer gets. This metric speaks to higher-quality posts for the forum.

         Total posts   Total reputation   Quality of Post Score*
Range    5 - 14,307    3 - 26,497         0.09 - 5.8
Mean     1,412         2,866              1.92
Median   513           824                1.85

*Interactions per post (total reputation divided by total posts).

Because of the large range in each category, with bunching at the high end, I would suggest forum members look at the median. Your best chance of being chosen is to have over 500 posts and over 800 total reputation; the median Quality of Post Score was 1.85 interactions per post. (Note that the score is computed per member, so its median is not simply the median reputation divided by the median post count.) Like I said earlier, a number of members were chosen with less than this, but most of those under 500 posts were in the 100-200 range. I think the biggest takeaway is to post frequently and create posts that others want to engage with. I think an average user can get to 200+ posts in 2-3 months. There might be an issue here that leads to inequity in who gets chosen: as I started reviewing some of the posts and interactions, I noticed that if you are chosen for a higher-profile test, your posts and interactions increase at a higher rate than other members', which easily boosts your profile and makes it easier to be selected again. I don't see an easy way to smooth out this distinct increase, but it is something I hope the mods take notice of. (True)
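If you want to reproduce the table above from a dataset like mine, the summary stats are straightforward. This is just a sketch with made-up numbers, since I'm not publishing the raw spreadsheet; it also shows why the Quality of Post Score has to be computed per member before summarizing, which is why its median (1.85) is not the median reputation divided by the median post count (824 / 513 is roughly 1.6).

from statistics import mean, median

# Made-up example values; the real dataset has 202 unique testers.
posts      = [120, 513, 980, 1412, 14307]
reputation = [95, 824, 2100, 2866, 26497]

# Quality of Post Score: interactions per post, computed per member.
quality = [rep / cnt for rep, cnt in zip(reputation, posts)]

for label, values in (("Posts", posts),
                      ("Reputation", reputation),
                      ("Quality of Post Score", quality)):
    print(f"{label}: range {min(values):,.2f} - {max(values):,.2f}, "
          f"mean {mean(values):,.2f}, median {median(values):,.2f}")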
Become an MGS Donor – This category comes with major limitations. The only public piece of data I was able to collect was whether the member displayed the MGS Donor badge on their profile. From that, I found 59.9% of testers had a donor badge. I suspect the true number may be closer to 70-80%, but some members have not reached out to get the badge added. (Inconclusive, but probably true)

Overall, MGS is clear about how testers are selected. For the most part the criteria are clear, and the data shows that they follow them.

Common misconceptions

I wanted to add a section that debunks some of the misconceptions I had when I first joined the forum.

- I have to be a member for a long time before I get anything. (FALSE)
  o Over the past 18 months, selected members had profiles created anywhere from March 2010 to November 2020. 42 testers (21%) joined in 2020. If you're new and contributing to the forum, you have a decent chance of being selected.
- Once I get one review, I won't ever get another. (FALSE)
  o 71 members (35%) had more than one review in the 18 months. I suspect that if I went back even further, they would have other reviews too. As I said in the section above, if you get chosen and do a good review, you actually put yourself at an advantage to be selected again.
  o No reviewer had more than 3 reviews in the past 18 months. That appears to be the limit; it looks to be extremely rare for a member to have more than one review every six months.
- I have to be good at golf to be selected for a review. (FALSE)
  o Very much false: MGS wants members of all skill levels involved in testing. To check this, I pulled the handicaps listed on members' profiles. This is a limitation because it is member-provided information that is most likely not updated regularly. Regardless, here is what I found: Range: 33 to +5; Mean: 10.09; Median: 9.2.
  o If anything, having an accurate handicap is beneficial to being selected, because it appears that MGS wants a broad range of testers for each product, specifically balls and clubs.

Conclusion

For the most part, MGS sticks to what they say when it comes to choosing testers, but it appears there are some non-measurables in play when it comes to selecting first-time testers. It does not appear that there is a formula, or that it is completely random. There seems to be a human element, which makes it hard to say, from the outside looking in, exactly why specific members were chosen over others.

Shoutout to @Brian A, @jlukes, @B.Boston, @txgolfjunkie, @gavinski91, @Mtbryant01, @daviddvm for having been selected to review over $3,000 worth of equipment! These individuals were selected for this year's or last year's CobraConnect challenge, so each of them had at least one product review in addition to CobraConnect in the past 18 months.

TL;DR – Post quality content frequently; if you are given a chance to test an item, take it seriously; and keep posting. If you're new and you stay involved, your time will come. Okay, I'm tired. Let me know your thoughts!