
TBS

Member
  • Posts: 322
  • Joined
  • Last visited

Reputation Activity

  1. Like
    TBS got a reaction from Will A in Titleist Linkmaster bags   
    I've been looking to upgrade my bag, and I've decided I want to try a more premium option that is carry-friendly but primarily for a pushcart or riding. I want a smaller-profile bag, so the Vessel players bags look a little too big. The Vessel Lite Stand is on my list, but it looks like they are discontinuing it and launching something new for the 2022 season. I've thought about Stitch, but I really want one more pocket/compartment. If it had the ball compartment on the back, it would be a finalist. 
    I stumbled on the Titleist Linkmaster series on a recent golf trip and was super impressed. The only downside is that it may be a 1/2 lb heavier than the Stitch or Vessel, but the quality, durability, and customer service of Titleist make up for that. GolfWRX has a few threads on them, but I wanted to see if anyone here had any experience or thoughts on this bag.
  2. Like
    TBS got a reaction from fixyurdivot in Between clubs: hit it hard or take some off?   
    Shot Scope published a report showing that the majority of golfers miss the green short. I have always factored that into my club selection: I want a club that will comfortably carry to the front of the green. 
  3. Hmmm
    TBS got a reaction from Bang60 in Liv Golf Central Thread: Events and News   
    LIV secures a contract with CW - https://www.espn.com/golf/story/_/id/35480510/liv-golf-secures-tv-streaming-deal-cw-network
    A few takeaways and questions:
    - All three days will be on the CW app.
    - Sat/Sun will be on the CW TV network. I saw someone on Twitter reply that local CW stations might already have programming on those days for their market, and it will be up to them to decide whether to air it. I'm in the Southeast, and once upon a time MLB aired some games on CW. A quick Google search shows the Mets still have a deal with CW: https://www.mlb.com/mets/team/broadcast-information. I don't follow MLB, so I'm not sure if this is actually a problem.
    - LIV doesn't get rights fees and still pays production costs; CW and LIV have a profit-sharing agreement.
    - Commercials: I'm assuming there will now be commercials? I'm very interested to see how this works with the shotgun start and keeping the flow and the promise of twice as many shots.
    - No YouTube: the audience is now primarily US, and the broadcast is less accessible internationally. I would think they would want the Asian market. Maybe the CW app is available there?
    - Nielsen data: now that it's not on YouTube, where bots can impact the reliability of viewership numbers, I will be very interested to see what the Nielsen numbers are. I think this could be a big challenge moving forward.
  4. Like
    TBS got a reaction from funkyjudge in Mandatory tee boxes?   
    Hopping in late here. A few thoughts:
    First, golf is for fun. Play whatever tees you want to have fun. As long as you play fast and respect the course, I don't care. There are match-play courses that have zero tee boxes; tee it up wherever you want. Ohoopee is one, plus another course in the NE (can't remember the name). Tom Coyne visited and had a fun reflection on how he doesn't like it because you can't post your round to GHIN, but he also likes it because it's about fun. 
    Second thought: I've played this course plus a lot of the other courses on the RTJ Trail. These courses are designed with clear landing areas, typically 120-170 yards out. I have played a number of scrambles where I play a tee up, and I have found that it makes the course play much harder if you choose to hit driver: landing areas are much smaller, slopes instead of flat lies, OB, awkward yardages, the fairway ends and the second shot is from the rough, etc. Otherwise you can play hybrids and irons all day. So I do think there is some thought to be given to course design.
     
    Final thought: why do pros hate a 300-yard par 3 but love 300-yard par 4s if the average scores are not too different?
    Short par 4s
    Riviera hole 10 – avg score 3.88 (2021)
    TPC River Highlands hole 15 – avg score 3.85 (2022)
    Long par 3s
    Oakmont hole 8 – avg score 3.3 (2016)
    Ocean Course hole 17 – avg score 3.3 (2012)
  5. Like
    TBS got a reaction from EasyPutter in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
    Background 
    After I took a few months off of MGS due to some work and life things that needed my attention, I finally got some time to log back on and post a long-overdue review. After posting, I found myself curious as to how I was selected and the likelihood of being selected again, especially since my product was so niche and did not receive many interactions. So I took it upon myself to do a not-so-scientific study of how reviewers are selected and compare it to the post by the Mods on how testers are chosen (https://forum.mygolfspy.com/faq/testing/how-are-equipment-testers-chosen-r11/). While there are no bombshells, I think we Spys understand the value of data in telling our story. So, I did my best at pulling all the data from the past 18 months of reviews.
     
    Methods 
    First off, I am no data scientist. From DJ to Bryson, I’m somewhere in the middle. I mainly do very basic data analysis in my role through Excel. I know, I know, SQL is much more efficient, and I’ll get around to learning it. For this study, I focused on 18 months of product reviews; I chose this timeframe to get some data points outside of the pandemic. I started going through the review pages and noticed they were not as organized as I thought, so I chose to use only the 18 months of reviews that were already on the “Past Opportunities” page in the dropdown menu. There may have been a few more reviews that had not been archived to this page, but I don’t think the number is significant.
    I then did the manual labor of data entry and collection. I clicked on each individual “TESTERS ANNOUNCED” post within the time frame and recorded the following data points:
    -       Date the review was published
    -       Product
    -       Rough product MSRP
    -       MGS username
    -       Whether the user has a Donor badge on their profile
    -       Content count
    -       Total reputation
    -       Month and year the MGS account was created
    -       City and state
    Once all of this information was input into the spreadsheet, I added a note if a member had multiple reviews, and I created an additional metric to measure the quality of their posts. All analysis comes from these data points.
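    If it helps anyone picture the spreadsheet, here is a rough sketch of one row translated into Python. The field names are my own and purely illustrative (I did all of the actual work in Excel), and the quality metric shown is the one I define in the "Start posting" section below.

```python
from dataclasses import dataclass

@dataclass
class TesterRecord:
    """One row per tester per review, as collected from the forum."""
    review_date: str         # date the tester announcement was published
    product: str
    msrp: float              # rough MSRP from a quick Google search
    username: str
    has_donor_badge: bool    # Donor badge visible on the profile
    content_count: int       # post count when I collected it, not at selection
    total_reputation: int    # same caveat as content_count
    joined: str              # month/year the MGS account was created
    location: str            # city and state from the profile

    @property
    def quality_of_post_score(self) -> float:
        """Interactions per post: total reputation divided by post count."""
        return self.total_reputation / self.content_count
```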
     
    Limitations
    Overall, the main limitation is that this dataset was manually curated, so there could be a mistake, but any mistake would be minor. I have checked that there is not an extra 0 added somewhere, but reputation numbers could be off by +/- 5 due to fat fingers on a keyboard. Additionally, the MSRP came from a quick Google search; I get that there could have been sales, or the price may have been $25 higher at the time of testing. Finally, the content count and total reputation numbers are not accurate to the time the member was selected. There is no way for me to go back and see this information, but I proceeded with collecting it because, for the most part, I believe that most people’s post counts grow linearly, so the quality of posts metric should not be too far off. I will note that in bigger reviews (CobraConnect), this ratio can get skewed. Enough with the small talk, let’s take a look at the data.
     
    Data
    By the numbers:
    -       48 total products
    -       202 unique testers
    -       ~$119,625 total cost of products tested
     
    As I said in the introduction, I will be using the MGS selection criteria as a framework of analysis.
    Picture – I did not go through and collect this data. This is also the easiest criterion of all of them. Put something up there! I’ve seen pictures of the people (I assume), cartoons, their children, memes, etc. Just put something up. (Inconclusive, but probably true)
    Prove review chops – 71 testers (35%) had more than 1 review. If you get chosen, even for the smallest thing, a pushcart distance tracker, a dozen balls, etc., take it seriously and write a good review. This means good pictures, concise statements that clearly articulate your opinion on a product, timely submissions, and good follow-ups. A good first review will go a long way toward getting your second review, and the data clearly backs this up. Additionally, one anecdotal note: it was very common for there to be a progression of items to review, starting with a smaller item and then growing to more expensive items. (True)
    Start posting – Please remember that this category is not perfect, because I was not able to pull the post and reputation numbers at the time the testers were selected; I am assuming that people post at a linear rate. I looked at three areas for this section: number of posts, total reputation, and a third metric that I created, the Quality of Post Score. For this last one, I wanted to look at how many interactions per post a reviewer gets, which speaks to the quality of their posts for the forum.
     
              Total number of posts   Total reputation   Quality of Post Score*
    Range     5 – 14,307              3 – 26,497         0.09 – 5.8
    Mean      1,412                   2,866              1.92
    Median    513                     824                1.85
    Because of such a large range, with bunching at the higher end of each category, I would suggest forum members look at the median. Your best chance of being chosen is to have over 500 posts with over 800 total reputation, which comes out to about 1.85 interactions per post. Like I said earlier, a number of members were chosen with less than this, but most of those under 500 posts were in the 100-200 range. I think the biggest takeaway is to post frequently and create posts that others want to engage with. I think an average user can get to 200+ posts in 2-3 months. 
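    To illustrate why I lean on the median rather than the mean here, a minimal sketch in Python (the post counts below are made up for illustration; the real dataset had 202 testers with a long right tail):

```python
import statistics

# Hypothetical post counts for a handful of testers. A few very
# prolific members create a long right tail, just like the real data.
post_counts = [5, 48, 120, 513, 880, 2400, 14307]

mean = statistics.mean(post_counts)      # dragged upward by the outliers
median = statistics.median(post_counts)  # resistant to the long tail

print(f"mean: {mean:.0f}, median: {median:.0f}")
# With a heavy right tail the mean lands well above the median,
# so the median is the better "typical tester" benchmark.
```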
    There might be an issue that leads to inequity when it comes to being chosen. As I started reviewing some of the posts and interactions, I noticed that if you are chosen for a more high-profile test, your posts and interactions will increase at a higher rate than other members’. This easily leads to a boost in your profile, making it easier to be selected again. I don’t think there is a way to smooth out this distinct increase, but it is something that I hope the mods take notice of. (True)
    Become an MGS Donor – This category comes with major limitations. The only public piece of data I was able to collect was whether the member included the MGS Donor badge on their profile. From this information, I found that 59.9% of testers had a donor badge on their profile. I suspect that the true number may be closer to 70-80%, but some members have not reached out to get the donor badge added.
    (Inconclusive but probably true)
    Overall, MGS is clear about how testers are selected. For the most part their criteria are clear, and the data shows that they follow the criteria for selection. 
    Common misconceptions
    I wanted to add a section that debunked some of the common misconceptions that I had when I first joined the forum. 
    -       I have to be a member for a long time before I get anything (FALSE)
    o   Over the past 18 months, members who were selected had profiles created from March 2010 – November 2020. 42 testers (21%) joined in 2020. If you’re new and are contributing to the forum, you have a decent chance of being selected.
    -       Once I get one review, I won’t ever get another (FALSE)
    o   71 members (35%) had more than 1 review in 18 months, and I suspect that if I went back even further, they would have other reviews too. As I said in the section above, if you get chosen and do a good review, you actually put yourself at an advantage to be selected again.
    o   No reviewer had more than 3 reviews in the past 18 months. It appears that this is the limit; it looks to be extremely rare for a member to have more than 1 review every 6 months. 
    -       I have to be good at golf to be selected for a review (FALSE)
    o   This idea is very much false; MGS wants members of all skill levels to be involved in testing. To check this, I pulled the handicaps listed on members’ profiles. This is a limitation because it is member-provided information that is most likely not updated regularly. Regardless, here is what I found.
    Range – 33 to +5, Mean – 10.09, Median – 9.2
    o   If anything, having an accurate handicap is beneficial to being selected, because it appears that MGS wants a broad range of testers for each product, specifically balls and clubs.
     
    Conclusion
    For the most part, MGS sticks to what they say when it comes to choosing testers, but it appears that there are some non-measurables that they use when selecting first-time testers. It does not appear that there is a formula, or that it is completely random. There seems to be a human element, which makes it hard to say, from the outside looking in, exactly why specific members were chosen over others. 
    Shoutout to @Brian A, @jlukes, @B.Boston, @txgolfjunkie, @gavinski91, @Mtbryant01, @daviddvm for having been selected to review over $3,000 worth of equipment! These individuals were selected for this year's or last year's CobraConnect challenge, so each of them had at least one product review in addition to CobraConnect in the past 18 months. 
     
    TL;DR – Post quality content frequently; if you are given a chance to test an item, take it seriously; and keep posting. If you’re new and you stay involved, your time will come. 
     
    Okay, I’m tired. Let me know your thoughts!
  6. Like
    TBS reacted to GolfSpy_APH in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
    I'll see if we can maybe link this post to the how-to-become-a-tester thread.
  7. Like
    TBS got a reaction from Undershooter in Liv Golf Central Thread: Events and News   
    Looks like LIV might not have the unlimited $$$ they thought. With more direct oversight of the kingdom's Public Investment Fund, I think any of these ventures that do not show a return won't be around too much longer.
    Read the WSJ article – Saudi Crown Prince Tangles with Sovereign Wealth Fund Over How to Invest Oil Riches
    https://www.wsj.com/articles/saudi-arabia-mbs-mohammed-bin-salman-public-investment-fund-11672766494?mod=djem10point
  8. Like
    TBS got a reaction from GolfSpy_KFT in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
    For sure! We appreciate all that you do, and although I love me some juicy LIV golf discussion, I want to see 2023 have some very lively equipment and "regular golf" discussions. 
    With a job change and a 1-year-old, I don't post as much, but I've always enjoyed the community here, and the more tests there are, the more energy the forum has.
  9. Like
    TBS got a reaction from GolfSpy_KFT in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
  10. Like
    TBS got a reaction from GolfSpy_SHARK in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
  11. Like
    TBS got a reaction from cksurfdude in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
  12. Like
    TBS got a reaction from cksurfdude in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
    Go ahead and share... Isn't MGS about providing "the truth and nothing but" and shouldn't their forum fall under the same mission?
  13. Like
    TBS got a reaction from cksurfdude in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
    Yes, that should be added somewhere. In all of those cases, these "special" selections were clearly stated in the tester announcement, if I remember correctly. 
  14. Like
    TBS got a reaction from cksurfdude in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
  15. Like
    TBS reacted to GolfSpy_APH in Liv Golf Central Thread: Events and News   
    At this point I really feel bad for the guys having to constantly say "no, I'm staying" just because LIV needs more hype and isn't getting it.
    https://www.golfdigest.com/story/patrick-cantlay-shuts-down-rumors-about-going-to-liv-golf
  16. Like
    TBS got a reaction from GolfSpy_APH in Liv Golf Central Thread: Events and News   
  17. Like
    TBS reacted to DaveP043 in Liv Golf Central Thread: Events and News   
    I don't think the Masters pays much attention to pressure from the Tour, or from anyone else. I really think that it will be a diminishing issue. LIV will eventually get world ranking points, but with no-cut, 54-hole events and less-than-outstanding strength of field, the points will be pretty limited. Most LIV players will drop out of the "automatic invite" rankings, and the Masters won't need to change a thing. 
  18. Like
    TBS reacted to GolfSpy_APH in Liv Golf Central Thread: Events and News   
    Even if they get through a full season next year, which I expect they will, and future seasons, there will always be a risk of the plug being pulled and the whole thing being abandoned. 
  19. Like
    TBS got a reaction from AndySP in Liv Golf Central Thread: Events and News   
    https://www.nytimes.com/2022/12/11/sports/golf/liv-saudi-pga.html
    Much like the Saudis' other ventures, it's becoming more evident that if this doesn't take off soon, they will drop the project. They already don't have the best record of finishing big projects; see the Jeddah Tower. 
  20. Like
    TBS reacted to cksurfdude in Shot Scope Users Group   
    Yes, you added it correctly.
    Using an extra tag for the old driver is helpful if you want to take them both out at the same time, and hit two balls side by side to later compare your on-course results.
     
  21. Like
    TBS got a reaction from tony@CIC in Shot Scope Users Group   
    Question for the group - 
    I just got a new driver. I added the club to "My Bag" on my dashboard, switched the D tag from the old driver to the new one, and updated "My Bag" to show that the D tag is now on the new club and the old club is not active.
    Is this the best way to add a new club? Will I be able to see and compare data from old driver to new driver this way? Or should I use one of the extra tags on my new driver?
  22. Like
    TBS got a reaction from cksurfdude in Shot Scope Users Group   
  23. Like
    TBS reacted to cnosil in Shot Scope Users Group   
    I believe that when you do this, the new driver shows up in the stats with a different color. I was hoping to do this with putters, but it doesn’t appear to work for that club.  
  24. Like
    TBS got a reaction from Bobbers in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   
  25. Like
    TBS got a reaction from edingc in Unofficial Review – How MGS chooses their reviewers, a non-scientific study of 18 months of member reviews.   