Search Results

Search found 3103 results on 125 pages for 'amazon appstore'.

Page 9/125 | < Previous Page | 5 6 7 8 9 10 11 12 13 14 15 16  | Next Page >

  • How to get the list of price offers on an item from Amazon with python-amazon-product-api item_lookup

    - by miernik
    I am trying to write a function to get a list of offers (their prices) for an item based on its ASIN:

        def price_offers(asin):
            from amazonproduct import API, ResultPaginator, AWSError
            from config import AWS_KEY, SECRET_KEY
            api = API(AWS_KEY, SECRET_KEY, 'de')
            str_asin = str(asin)
            node = api.item_lookup(id=str_asin, ResponseGroup='Offers',
                                   Condition='All', MerchantId='All')
            for a in node:
                print a.Offer.OfferListing.Price.FormattedPrice

    I am reading http://docs.amazonwebservices.com/AWSECommerceService/latest/DG/index.html?ItemLookup.html and trying to make this work, but every call just fails with:

        Failure instance: Traceback: <type 'exceptions.AttributeError'>:
        no such child: {http://webservices.amazon.com/AWSECommerceService/2009-10-01}Offer
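
    A hedged sketch of a fix, assuming the standard ItemLookup response layout: the lookup returns the whole ItemLookupResponse document, so there is no Offer child at the level being iterated; the offers sit several levels down, under Items > Item > Offers > Offer.

        def price_offers(asin):
            from amazonproduct import API
            from config import AWS_KEY, SECRET_KEY
            api = API(AWS_KEY, SECRET_KEY, 'de')
            root = api.item_lookup(id=str(asin), ResponseGroup='Offers',
                                   Condition='All', MerchantId='All')
            # Walk down to the individual offers instead of iterating the response root.
            for offer in root.Items.Item.Offers.Offer:
                print offer.OfferListing.Price.FormattedPrice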

    Read the article

  • Delete Amazon S3 buckets?

    - by Kyle Cronin
    I've been interacting with Amazon S3 through S3Fox and I can't seem to delete my buckets. I select a bucket, hit delete, confirm the delete in a popup, and... nothing happens. Is there another tool that I should use?
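
    A hedged sketch of a likely cause and a programmatic workaround: S3 refuses to delete a bucket that still contains objects, which is the usual reason a delete appears to do nothing. With a current SDK such as boto3 the bucket can be emptied first and then removed (the bucket name below is a placeholder):

        import boto3

        s3 = boto3.resource("s3")
        bucket = s3.Bucket("my-bucket-name")      # hypothetical bucket name
        bucket.objects.all().delete()             # empty the bucket (use object_versions on versioned buckets)
        bucket.delete()                           # an empty bucket can be deleted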

    Read the article

  • Web based client for Amazon S3

    - by Dick Lebavo
    We are looking for a secure online solution to access our files stored on Amazon S3. We have about 3K files, mostly media and documents, that we need to make available to our employees on the move. We don't want to develop anything in-house if there is an existing solution. Please note that our employees are not technologically minded, so a simple web-based upload/download GUI would work best.

    Read the article

  • Amazon Product Advertising API: XXX is not a valid value for BrowseNodeId

    - by Chris
    Hello, I am using the Amazon Product Advertising API to fetch the product categories. It works for US categories, but when I use browse nodes from other sites I get the following error: "569604 is not a valid value for BrowseNodeId. Please change this value and retry your request." I got the browse node IDs from the following page: http://docs.amazonwebservices.com/AWSECommerceService/latest/DG/index.html?BrowseNodeIDs.html Where is the problem? Thanks for your help!
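
    The likely cause is that browse node IDs are locale-specific: an ID taken from, say, the amazon.de list is only valid against the DE endpoint, not the default US one. A hedged sketch assuming the python-amazon-product-api package (used elsewhere on this page) and that it exposes a browse_node_lookup call:

        from amazonproduct import API
        from config import AWS_KEY, SECRET_KEY   # hypothetical config module

        # Pass the locale that matches the site the node ID was taken from, e.g. 'de'.
        api = API(AWS_KEY, SECRET_KEY, 'de')
        root = api.browse_node_lookup(569604)
        print root.BrowseNodes.BrowseNode.Name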

    Read the article

  • Can no longer deploy to Amazon AWS using VS 2010

    - by KevinDeus
    Did something change on Amazon recently? I'm trying to redeploy to my Amazon instance, and the "Publish to Amazon CloudFormation" plugin for VS 2010 no longer appears to update my instance. It tells me the upload is successful, but my instance does not appear to be updating on Amazon. I've tried disabling all my instances and using the tool to create a new instance, but no luck. I do see that the URL of the deployed application (which looks like this: http://c2-107-20-11-27.compute-1.amazonaws.com) does not appear to match up with the public IP of my instance on Amazon, even when it creates a new one. This seems to indicate that something might be broken. Any clues? (By the way, whenever the Amazon VS 2010 plugin creates a new instance, I make sure to reconnect my Elastic IP to the new instance.)

    Read the article

  • Amazon’s New Kindle Fire Tablet: the How-To Geek Review

    - by The Geek
    We got our Kindle Fire a few days ago, and since then we’ve been poking, prodding, and generally trying to figure out how to break it. Before you go out and buy your own, check out our in-depth review. Note: This review is extremely long, so we’ve split it up between multiple pages. You can use the navigation links or buttons at the bottom to flip between pages.

    Read the article

  • Amazon EC2 EBS automatic backup one-liner works manually but not from cron

    - by dan
    I am trying to implement an automatic backup system for my EBS volume on Amazon AWS. When I run this command as ec2-user, everything works fine:

        /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-********

    But if I add this line to /etc/crontab and restart the crond service, it doesn't work:

        15 12 * * * ec2-user /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-********

    I checked /var/log/cron and it contains this line, so the command does get executed:

        Dec 13 12:15:01 ip-10-204-111-94 CROND[4201]: (ec2-user) CMD (/opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-******** )

    Can you please help me troubleshoot the problem? I guess it is some environment problem - maybe the lack of some variable. If that's the case I don't know what to do about it. Thanks.
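
    The guess about the environment is almost certainly right: cron runs with a minimal environment, and the ec2-* wrappers need EC2_HOME, JAVA_HOME and a full PATH, which are set in the ec2-user login shell but not in /etc/crontab. Either export those variables in the crontab (or a wrapper script), or sidestep the Java tools entirely. A hedged alternative sketch using boto3, which only needs credentials from the environment or an instance role (the volume ID is a placeholder):

        #!/usr/bin/env python
        # Create the EBS snapshot directly via the API, so the cron job no longer
        # depends on EC2_HOME/JAVA_HOME/PATH being set.
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")
        snapshot = ec2.create_snapshot(VolumeId="vol-xxxxxxxx",
                                       Description="vol-xxxxxxxx snapshot")
        print(snapshot["SnapshotId"])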

    Read the article

  • Amazon Product Availability

    - by user201140
    Many products on Amazon are unavailable, but a date is given for when they'll be back in stock. I've used AWS and got an XML reply, but I can't find the date information anywhere. Is it possible to get this information? Thanks. NOTE: This is the request I'm sending; what should I alter?

        http://ecs.amazonaws.com/onca/xml?AWSAccessKeyId=MYID&AssociateTag=MYTAG&ItemId=THEITEMID&Operation=ItemLookup&ResponseGroup=Large&ReviewSort=-HelpfulVotes&Service=AWSECommerceService&Signature=MYSIGNATURE&Timestamp=2009-12-04T17%3A35%3A43Z&Version=2009-06-01
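
    It is not clear the API exposes the exact restock date at all; the closest fields in the offer data are the Availability and AvailabilityAttributes elements on each OfferListing, so a reasonable first step is to dump those for the item (adding ResponseGroup=Offers or OfferFull to the request if they are not already present). A hedged sketch; the response namespace normally tracks the Version parameter used above, and the signed URL is a placeholder:

        import urllib2
        import xml.etree.ElementTree as ET

        SIGNED_URL = "http://ecs.amazonaws.com/onca/xml?...signed-ItemLookup-request..."   # placeholder
        NS = "{http://webservices.amazon.com/AWSECommerceService/2009-06-01}"

        tree = ET.parse(urllib2.urlopen(SIGNED_URL))
        for listing in tree.iter(NS + "OfferListing"):
            print listing.findtext(NS + "Availability")
            print listing.findtext(NS + "AvailabilityAttributes/" + NS + "AvailabilityType")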

    Read the article

  • Using Amazon S3 with CloudFront as a CDN

    - by weezybizzle
    I would like to serve user-uploaded content (pictures, videos, and other files) from a CDN. Using Amazon S3 with CloudFront seems like a reasonable way to go. My only question is about the speed of the file system. My plan was to host user media under URIs like cdn.mycompany.com/u/u/i/d/uuid.jpg. I don't have any prior experience with S3 or CDNs, and I was just wondering whether this strategy would scale well to handle a large amount of user-uploaded content, and whether there is a more conventional way to accomplish this.
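
    For what it's worth, S3 keys are flat, so the u/u/i/d/ segments are just part of the key name rather than real directories, and this kind of sharded layout is a common convention. A hedged sketch of the upload side under that layout, assuming boto3 and a CloudFront distribution with cdn.mycompany.com as its CNAME in front of the bucket (bucket and host names are placeholders):

        import uuid
        import boto3

        BUCKET = "mycompany-user-media"     # hypothetical bucket behind CloudFront
        CDN_HOST = "cdn.mycompany.com"

        def store_upload(file_obj, ext="jpg"):
            uid = uuid.uuid4().hex
            # Shard by the first characters of the id so no single prefix grows huge.
            key = "%s/%s/%s/%s/%s.%s" % (uid[0], uid[1], uid[2], uid[3], uid, ext)
            boto3.client("s3").upload_fileobj(file_obj, BUCKET, key)
            return "http://%s/%s" % (CDN_HOST, key)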

    Read the article

  • Which service to use in Amazon's free tier for a Java app

    - by vikas devde
    This is the first time I am going to host a Java app. Amazon offers a free tier that provides free usage for one year, and I am going to use it. But there are so many services (S3, EC2, etc.) that I can't figure out which one is for web hosting or what each service is for; the docs are so huge that I am confused about what to read. Can anybody give some pointers on which service to use specifically for Java apps and how much I will be charged? They also ask for credit/debit card credentials at signup - does that mean they will charge me even during the free period?

    Read the article

  • Increasing Amazon root volume size

    - by OCD
    I have a default Amazon EC2 instance with an 8GB root volume, and I am running out of space. I have:

      1. Detached the current EBS volume in the AWS Management Console (web).
      2. Created a snapshot of this volume.
      3. Created a new 50GB volume from that snapshot.
      4. Attached the new volume back to the instance at /dev/sda1.

    I can see from the management console that my new volume is attached. However, when I reconnect to the instance, df -h still shows:

        Filesystem           1K-blocks      Used Available Use% Mounted on
        /dev/xvda1             8256952   8173624         0 100% /
        tmpfs                   308508        40    308468   1% /dev/shm

    It's still not using my new volume's size; how do I make this work?

    Read the article

  • Best practices for using Amazon SQS - Polling the queue

    - by alex
    I'm designing a service for sending out emails for our eCommerce site (order confirmations, alerts, etc.). The plan is to have a "SendEmail" method that generates a chunk of XML representing the email to be sent and sticks it on an Amazon SQS queue. My web app(s) and other applications will use this to "send" emails. I then need a way of checking the queue and physically sending out the email messages (I know how I'm going to be dispatching the emails). I'm curious what the best way to "poll" the queue would be. Should I create a Windows service and use something like Quartz.net to schedule it to check the queue every x minutes, for example? Is there a better way of doing this?
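
    A hedged sketch of the consumer loop using SQS long polling (the WaitTimeSeconds parameter on ReceiveMessage), shown in Python for brevity - the same ReceiveMessage / DeleteMessage calls exist in the .NET SDK, so the loop ports directly into a Windows service. With long polling each call blocks until a message arrives or the wait expires, so there is no need for a scheduler firing every few minutes. The queue URL and dispatch function are placeholders:

        import boto3

        sqs = boto3.client("sqs", region_name="us-east-1")
        QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/email-queue"   # placeholder

        while True:
            resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                       MaxNumberOfMessages=10,
                                       WaitTimeSeconds=20)       # long poll for up to 20 seconds
            for msg in resp.get("Messages", []):
                send_email_from_xml(msg["Body"])                  # hypothetical dispatch function
                sqs.delete_message(QueueUrl=QUEUE_URL,            # delete only after a successful send
                                   ReceiptHandle=msg["ReceiptHandle"])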

    Read the article

  • Problem downloading .exe file from Amazon S3 with a signed URL in IE

    - by Joe Corkery
    I have a large collection of Windows exe files which are being stored/distributed using Amazon S3. We use signed URLs to control access to the files and this works great except in one case when trying to download a .exe file using Internet Explorer (version 8). It works just fine in Firefox. It also works fine if you don't use a signed URL (but that is not an option). What happens is that the IE downloader changes the name from 'myfile.exe' to 'myfile[1]' and Windows no longer recognizes it as an executable. Any advice would greatly be appreciated. Thanks.
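
    A common workaround, sketched here with boto3 (the same response-content-disposition override can be added to a signed URL from any SDK): have the signed URL force the download filename and content type so IE keeps the .exe name instead of renaming it to myfile[1]. Bucket, key and expiry below are placeholders:

        import boto3

        s3 = boto3.client("s3")
        url = s3.generate_presigned_url(
            "get_object",
            Params={
                "Bucket": "my-installer-bucket",
                "Key": "installers/myfile.exe",
                "ResponseContentDisposition": 'attachment; filename="myfile.exe"',
                "ResponseContentType": "application/octet-stream",
            },
            ExpiresIn=3600,   # one hour
        )
        print(url)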

    Read the article

  • Amazon-like ecommerce site

    - by Soule
    Hey there, my idea is to make an e-commerce site a lot like Amazon. Not exactly cloning it, but since it's for a niche market, I need something like it. I was thinking of using Magento or something similar as a base, but I can't figure out how to allow users to: sign up for an account and get verified by me; add items so they become searchable; and leave product reviews. What can I use to achieve this, and what are some suggestions? I can code in PHP and Python, thanks!

    Read the article

  • How to 'LOAD DATA INFILE' on Amazon RDS?

    - by feydr
    Not sure if this is a question better suited for Server Fault, but I've been messing with Amazon RDS lately and was having trouble granting 'file' privileges to my web host MySQL user. I'd assume that a simple:

        grant file on *.* to 'webuser@'%';

    would work, but it does not, and I can't seem to do it with my 'root' user either. What gives? The reason we use LOAD DATA is that it is super fast for doing thousands of inserts at once. Does anyone know how to remedy this, or do I need to find a different way? This page, http://docs.amazonwebservices.com/AmazonRDS/latest/DeveloperGuide/index.html?Concepts.DBInstance.html, seems to suggest that I need to find a different way around this. Help?

    UPDATE: I'm not trying to import a database - I just want to use the file load option to insert several hundred thousand rows at a time. After digging around, this is what we have:

        mysql> grant file on *.* to 'devuser'@'%';
        ERROR 1045 (28000): Access denied for user 'root'@'%' (using password: YES)

        mysql> select User, File_priv, Grant_priv, Super_priv from mysql.user;
        +----------+-----------+------------+------------+
        | User     | File_priv | Grant_priv | Super_priv |
        +----------+-----------+------------+------------+
        | rdsadmin | Y         | Y          | Y          |
        | root     | N         | Y          | N          |
        | devuser  | N         | N          | N          |
        +----------+-----------+------------+------------+
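
    A hedged workaround sketch: RDS never grants FILE because the server cannot read files from its own host, but LOAD DATA LOCAL INFILE streams the file from the client side and is normally allowed, which keeps the bulk-insert speed. Shown with the pymysql driver (host, credentials, file and table names are placeholders); the LOCAL capability has to be enabled on the connection:

        import pymysql

        conn = pymysql.connect(
            host="mydb.xxxxxxxx.us-east-1.rds.amazonaws.com",
            user="devuser", password="secret", db="mydb",
            local_infile=True,                  # allow LOCAL INFILE from this client
        )
        with conn.cursor() as cur:
            cur.execute("""
                LOAD DATA LOCAL INFILE '/path/to/rows.csv'
                INTO TABLE mytable
                FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'
            """)
        conn.commit()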

    Read the article

  • How to get the Amazon S3 PHP SDK working?

    - by JakeRow123
    I'm trying to set up S3 for the first time by running the sample file that comes with the PHP SDK, which creates a bucket and attempts to upload some demo files to it. But this is the error I am getting: "The difference between the request time and the current time is too large." I read in another question on SO that this is because Amazon validates a request by comparing the times of the server and the client, and the two must be within a 15-minute span of one another. Now here is the problem. My laptop's time is 12:30 AM, June 8, 2012 at the moment. On my server I created a file called servertime.php containing:

        <?php print strftime('%c'); ?>

    and the output is: Fri Jun 8 00:31:22 2012. It looks like the day is correct, but I don't know what to make of 00:31:22. In any case, how is it possible to always make sure the time between the client and server is within a 15-minute window? What if I have a user in China who wishes to upload a file on my site, which uses S3 as the CDN - then the time difference would be over a day. How can I make sure all my users' times are within 15 minutes of my server time? What if the user is in the U.S. but the time on their machine is misconfigured? Basically, how do I get S3 bucket creation and upload to work?
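
    One point worth hedging here: the 15-minute window is measured between AWS and the machine that signs the request - i.e. the web server running the PHP SDK - not your visitors' laptops, so user clocks in China or anywhere else are irrelevant. The usual fix is simply keeping the server's clock synced with NTP. A small diagnostic sketch (Python, with the requests package assumed) that measures the server's skew against the Date header S3 returns:

        import calendar
        import time
        from email.utils import parsedate
        import requests

        resp = requests.head("https://s3.amazonaws.com/")
        aws_epoch = calendar.timegm(parsedate(resp.headers["Date"]))   # Date header is GMT
        skew = abs(time.time() - aws_epoch)
        print("clock skew vs AWS: %.1f seconds" % skew)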

    Read the article

  • Amazon Product Advertising API - ItemLookup request working example

    - by I__
    Would anyone have a working example of an Amazon ItemLookup? I have the following code, but it does not seem to work:

        string ISBN = "0393326381";
        string ASIN = "";
        if (!(string.IsNullOrEmpty(ISBN) && string.IsNullOrEmpty(ASIN)))
        {
            AWSECommerceServicePortTypeChannel service = new AWSECommerceServicePortTypeChannel();
            ItemLookup lookup = new ItemLookup();
            ItemLookupRequest request = new ItemLookupRequest();
            lookup.AssociateTag = secretKey;
            lookup.AWSAccessKeyId = accessKeyId;
            if (string.IsNullOrEmpty(ASIN))
            {
                request.IdType = ItemLookupRequestIdType.ISBN;
                request.ItemId = new string[] { ISBN.Replace("-", "") };
            }
            else
            {
                request.IdType = ItemLookupRequestIdType.ASIN;
                request.ItemId = new string[] { ASIN };
            }
            request.ResponseGroup = new string[] { "OfferSummary" };
            lookup.Request = new ItemLookupRequest[] { request };
            response = service.ItemLookup(lookup);
            if (response.Items.Length > 0 && response.Items[0].Item.Length > 0)
            {
                Item item = response.Items[0].Item[0];
                if (item.MediumImage == null)
                {
                    //bookImageHyperlink.Visible = false;
                }
                else
                {
                    //bookImageHyperlink.ImageUrl = item.MediumImage.URL;
                }
                //bookImageHyperlink.NavigateUrl = item.DetailPageURL;
                //bookTitleHyperlink.Text = item.ItemAttributes.Title;
                //bookTitleHyperlink.NavigateUrl = item.DetailPageURL;
                if (item.OfferSummary.LowestNewPrice == null)
                {
                    if (item.OfferSummary.LowestUsedPrice == null)
                    {
                        //priceHyperlink.Visible = false;
                    }
                    else
                    {
                        //priceHyperlink.Text = string.Format("Buy used {0}", item.OfferSummary.LowestUsedPrice.FormattedPrice);
                        //priceHyperlink.NavigateUrl = item.DetailPageURL;
                    }
                }
                else
                {
                    //priceHyperlink.Text = string.Format("Buy new {0}", item.OfferSummary.LowestNewPrice.FormattedPrice);
                    //priceHyperlink.NavigateUrl = item.DetailPageURL;
                }
                if (item.ItemAttributes.Author != null)
                {
                    //authorLabel.Text = string.Format("By {0}", string.Join(", ", item.ItemAttributes.Author));
                }
                else
                {
                    //authorLabel.Text = string.Format("By {0}", string.Join(", ", item.ItemAttributes.Creator.Select(c => c.Value).ToArray()));
                }
                /*
                ItemLink link = item.ItemLinks.Where(i => i.Description.Contains("Wishlist")).FirstOrDefault();
                if (link == null)
                {
                    //wishListHyperlink.Visible = false;
                }
                else
                {
                    //wishListHyperlink.NavigateUrl = link.URL;
                }
                */
            }
        }

    The problem is with this line - I think it should be defined differently, but I do not know how:

        AWSECommerceServicePortTypeChannel service = new AWSECommerceServicePortTypeChannel();

    Read the article

  • EC2 persistence of machine

    - by Seagull
    I want to 'persist' my Amazon EC2 images. My scenario:

      - I have a range of Windows and Linux machines; some are EBS backed, whereas others are S3 backed.
      - I need to be able to persist a machine (put it to sleep), preferably keeping all the settings I had active when the machine was running.
      - I need to be able to quickly wake a machine from sleep [ideally with an SLA of less than 2 minutes to turn on, if such an SLA is available from Amazon].

    Here's the stuff that confuses me:

      - AWS allows me to put EBS-backed machines to sleep, but not S3-backed ones. I believe I can put S3-backed machines into some sort of persistence mode, but this involves shutting down the machine, writing it to S3 storage and then recovering from there (not a real sleep mode, but at least I don't continue to get billed for CPU).
      - S3 backing seems to take a long time either to write a machine to disk or to recover (turn on) a machine.
      - I can't tell immediately which machines are EBS backed and which are S3 backed. It seems like I can instantiate either type, but it's not immediately clear how Amazon decides whether a given machine should be EBS or S3 backed.

    Advice?
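
    A hedged sketch for the "which ones are EBS backed?" part: the root device type is exposed on every instance and image, so it can simply be listed. Only EBS-backed instances can be stopped ("put to sleep") and started again; instance-store (S3-backed) ones cannot.

        import boto3

        ec2 = boto3.client("ec2")
        for reservation in ec2.describe_instances()["Reservations"]:
            for inst in reservation["Instances"]:
                # RootDeviceType is either 'ebs' or 'instance-store'
                print("%s  %s" % (inst["InstanceId"], inst["RootDeviceType"]))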

    Read the article

  • Can't connect to IIS FTP site under Amazon EC2

    - by h3n
    IIS 7.5:

      - FTP Firewall Support: data channel port range 49152-65535, using the external (static) IP of the Amazon EC2 instance
      - FTP IPv4 Restriction: allow the Amazon EC2 static IP
      - FTP Authentication: Anonymous: Enabled, Basic: Disabled, IIS Manager: Enabled
      - FTP Authorization: Allow All Users: Read/Write

    Windows Firewall:

      - Inbound: port 21 open, port range 49152-65535 open
      - Outbound: port 20 open

    Amazon EC2 Security Group:

      - Custom TCP rule: 21
      - Custom TCP rule: 49152-65535

    It works in Internet Explorer when I type ftp://localhost on the server itself, but when I enter the Amazon EC2 static IP (ftp://IPADDRESS) it doesn't connect. I can't connect with FileZilla either.

    Read the article

  • Scripts to parse and download iTunes Connect and AppStore data

    - by bradhouse
    I'm looking for recommendations for a script or series of scripts that download and parse iTunes Connect sales data and AppStore comments, ratings and rankings data for a given app. I'm aware of solutions like:

      - AppViz
      - appsales-mobile
      - iphone-stats
      - Heartbeat.app

    I'm sure I'll find a few more with more searching, but I can't help feeling there must be a really decent set of open source scripts out there to do this, given how many developers are now writing apps for the AppStore. I would be interested to hear about commercial offerings as well (although my personal preference is for open source, so I can at least see what it is doing with my iTunes Connect login credentials). To be clear, I'm really looking for something that hits all of the areas mentioned:

      - App Store (per store): comments, ratings, category/store rankings
      - iTunes Connect: the contents of the sales reports

    Analysis/graphs of the data are not necessary (but would be nice to have, I guess). I'm not really looking for something like AppSales Mobile above; I would like the raw data so I can do my own analysis and formatting. So far it looks like AppViz (listed above) is the best out there. Any suggestions on what is good/available, or should I just roll my own?

    Read the article

  • Why does my custom Amazon EC2 AMI have limited instance type options?

    - by John
    The Basic 64-bit Amazon Linux AMI has the following instance type options available:

      - Micro
      - Large
      - Extra-Large
      - High-Memory Extra Large
      - ... etc.

    I booted up this AMI as a micro type, made customizations, shut it down, detached the volume, took a snapshot, and registered my own custom AMI:

        ec2-register --snapshot [snapshot_id] --description "my description" --name "my name" --kernel aki-427d952b

    That worked. HOWEVER, when I try to create an instance from my custom AMI, only the following instance types are available:

      - Micro
      - Small
      - High-CPU Medium

    ... which coincidentally are the same instance types available if you try to boot up the 32-bit Amazon image. Why do the available instance types of my custom image differ from those of the image I based it on?
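
    A hedged explanation and sketch: when an AMI is registered from a snapshot without an explicit architecture it defaults to i386, and a 32-bit AMI only offers the 32-bit instance types (Micro, Small, High-CPU Medium), which matches what is described above. With ec2-register the fix is adding "--architecture x86_64" (and using the matching 64-bit kernel); the equivalent call via boto3, with placeholder IDs, looks like this:

        import boto3

        ec2 = boto3.client("ec2")
        image = ec2.register_image(
            Name="my name",
            Description="my description",
            Architecture="x86_64",                    # the missing piece
            KernelId="aki-xxxxxxxx",                  # matching 64-bit AKI
            RootDeviceName="/dev/sda1",
            BlockDeviceMappings=[{
                "DeviceName": "/dev/sda1",
                "Ebs": {"SnapshotId": "snap-xxxxxxxx"},
            }],
        )
        print(image["ImageId"])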

    Read the article

  • Are whole VM images backed up on Amazon EC2/S3?

    - by John
    I've been trying to get my head around Amazon Web Services as a VPS provider. My understanding is that an EC2 instance running Windows is basically a Windows VM, very similar to renting a VPS from a more traditional hosting provider. I don't want complex backups, either to administer or to restore - if my restore involves installing SVN, MySQL, Jira, etc. on a new box before I can even try to restore the backup, then it's not great for me. What I really want is a service that backs up my entire VM... if the PC running the VPS dies, the VM image is installed on a new PC and off we go again. With Amazon being all about flexibility and elasticity, I wondered if they have this kind of service? I can't figure it out from reading their docs.
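
    A hedged sketch of the closest equivalent: for an EBS-backed instance, creating an AMI from the running instance snapshots every attached EBS volume, and launching that AMI later brings the machine back with everything (SVN, MySQL, Jira) already installed. The instance ID below is a placeholder:

        import boto3

        ec2 = boto3.client("ec2")
        resp = ec2.create_image(
            InstanceId="i-xxxxxxxx",
            Name="full-server-backup-2012-06-08",
            NoReboot=True,      # don't stop the instance; accept a crash-consistent snapshot
        )
        print(resp["ImageId"])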

    Read the article

  • Why do I get "Permission denied (publickey)" when trying to SSH from local Ubuntu to an Amazon EC2 server?

    - by Vorleak Chy
    I have an application running in the cloud on an Amazon EC2 instance, and I need to connect to it from my local Ubuntu machine. It works fine from one of my local Ubuntu machines and also from my laptop, but I get the message "Permission denied (publickey)" when trying to SSH to EC2 from another local Ubuntu machine. It's very strange to me. I'm thinking it could be some sort of problem with the security settings on the Amazon EC2 instance that limits which IPs can access the instance, or that the certificate may need to be regenerated. Does anyone know a solution?

    Read the article

  • How to set up Amazon EC2 with my own OS and DB?

    - by Spencer Lim
    I have my own versions of an OS and DB, which are Windows Server 2008 R2 with Hyper-V and SQL Server 2008 R2, both Enterprise edition. May I know how to get them configured and running on Amazon EC2, and what else is required to make this combination work? Also, how could I install the operating system and DNS? I have never run a server before; I just need something like a VPS to support my development and testing. Amazon EC2 seems the best and cheapest service at only $1 per hour. I would appreciate any brief guide. Thanks! =D

    Read the article

< Previous Page | 5 6 7 8 9 10 11 12 13 14 15 16  | Next Page >