  1. #21
    Just wondering what the "Page" setting is all about?
    Also wondering if there is some way to download more than 1,000 rows per export?

  2. #22
    Pages allow you to split your result set into blocks. You can set the page limit to 100 with a 1,000-record maximum and get back 10 pages of results, for example. 1,000 is the maximum you can export with a single query.

    It's a search engine, so each result set is capped at a fixed number. Take Google, for example: searching for the word "the" returns 9.8 billion results, but you can only ever see the first 1,000. That's because they only calculate weights for the first 1,000 results found; otherwise search would be slow, and they wouldn't have enough servers to handle the queries.
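    To make the page/limit relationship concrete, here is a minimal sketch of paging through a result set with Python's requests library. The endpoint URL and the api_key, query, limit, and page parameter names are assumptions based on the discussion above, not confirmed API documentation, so check them against the actual docs.

    ```python
    import requests

    API_URL = "http://api.prosperent.com/api/search"  # assumed endpoint
    API_KEY = "YOUR_API_KEY"

    PAGE_SIZE = 100      # records per page
    MAX_RECORDS = 1000   # hard cap per query mentioned above

    all_results = []
    for page in range(1, MAX_RECORDS // PAGE_SIZE + 1):  # pages 1..10
        resp = requests.get(API_URL, params={
            "api_key": API_KEY,   # assumed parameter name
            "query": "handbags",  # assumed parameter name
            "limit": PAGE_SIZE,   # assumed parameter name
            "page": page,         # assumed parameter name
        })
        resp.raise_for_status()
        rows = resp.json().get("data", [])
        if not rows:
            break                 # stop early if a page comes back empty
        all_results.extend(rows)

    print(f"Fetched {len(all_results)} records")
    ```

    Each iteration asks for one block of PAGE_SIZE records; ten pages of 100 is the 1,000-record ceiling described above.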

  3. #23
    Today I noticed that the affiliate_url includes my userid, but the number that looks like an API key is totally different from my actual API key:

    http://prosperent.com/store/product/Myuserid-8620-0/?k=Anuschka+Handbags+-+460+%28Jaipur+Paisley%29+-+Bags+and+Luggage&m=Zappos.com&b=Anuschka+Handbags&p=STRANGEAPIKEY&sid=handbags

    Something wrong here?
    Last edited by it2pro; 11-30-2013 at 07:10 PM. Reason: URL

  4. #24
    That is not an API key. It is the catalogId in our system for that specific product at that specific merchant.
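    For anyone else checking their links, a quick way to see what each query parameter actually holds is to split the URL apart with Python's standard library. This is just a sketch; the URL below mirrors the shape of the one posted above, with the same placeholder values.

    ```python
    from urllib.parse import urlparse, parse_qs

    # Placeholder affiliate URL in the same shape as the one posted above.
    url = ("http://prosperent.com/store/product/Myuserid-8620-0/"
           "?k=Anuschka+Handbags+-+460+%28Jaipur+Paisley%29+-+Bags+and+Luggage"
           "&m=Zappos.com&b=Anuschka+Handbags&p=STRANGEAPIKEY&sid=handbags")

    parsed = urlparse(url)
    params = parse_qs(parsed.query)

    # The path segment carries the userid; the p parameter is the catalogId,
    # not the API key, as explained above.
    print("path:", parsed.path)
    for key, values in params.items():
        print(key, "=", values[0])
    ```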

  5. #25
    OK, good to know. But why do I have to set the API key in the product settings for the CSV export then?

  6. #26

  7. #27
    Well, that is what I thought, but wouldn't it be more logical to add the userid directly instead? It's a bit confusing to add the API key and then not see it in the created links. Maybe it's just me.

  8. #28
    Quote Originally Posted by it2pro View Post
    Well, that is what I thought, but wouldn't it be more logical to add the userid directly instead? It's a bit confusing to add the API key and then not see it in the created links. Maybe it's just me.
    Yep! It's just you : )

  9. #29
    Is there any way to scrape the datafeed in a loop over HTTP on Windows? I tried wget, but it saves the file in a junk format and the file name is the same as the URL.

  10. #30
    Quote Originally Posted by ranjancom2000 View Post
    I tried wget, but it saves the file in a junk format and the file name is the same as the URL.
    You did it wrong, so I doubt you will get any solution on Windows to work either.
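    In case it helps, here is a minimal sketch of pulling the export in a loop from Python instead of wget, writing each page to a filename you choose so nothing ends up named after the raw URL. The endpoint and the api_key, limit, page, and format parameter names are assumptions carried over from the discussion above, so adjust them to match your actual export URL.

    ```python
    import requests

    EXPORT_URL = "http://api.prosperent.com/api/search"  # assumed endpoint
    API_KEY = "YOUR_API_KEY"
    PAGE_SIZE = 100
    MAX_RECORDS = 1000   # export cap per query mentioned earlier in the thread

    for page in range(1, MAX_RECORDS // PAGE_SIZE + 1):
        resp = requests.get(EXPORT_URL, params={
            "api_key": API_KEY,   # assumed parameter name
            "query": "handbags",  # assumed parameter name
            "limit": PAGE_SIZE,   # assumed parameter name
            "page": page,         # assumed parameter name
            "format": "csv",      # assumed parameter name for CSV output
        })
        resp.raise_for_status()
        # Pick the output filename yourself instead of letting the URL decide it.
        with open(f"datafeed_page_{page}.csv", "wb") as fh:
            fh.write(resp.content)
    ```

    If you would rather stay with wget, its -O option does the same job of naming the output file explicitly.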
