Search Results

Search found 2978 results on 120 pages for 'amazon aws'.

Page 13/120

  • Amazon’s New Kindle Fire Tablet: the How-To Geek Review

    - by The Geek
    We got our Kindle Fire a few days ago, and since then we’ve been poking, prodding, and generally trying to figure out how to break it. Before you go out and buy your own, check out our in-depth review. Note: This review is extremely long, so we’ve split it up between multiple pages. You can use the navigation links or buttons at the bottom to flip between pages.

    Read the article

  • Amazon AWS s3fs mount problem on Fedora 14

    - by Alex
    I successfully compiled and installed s3fs (http://code.google.com/p/s3fs/) on my Fedora 14 machine. I included the password credentials in /etc/ as specified in the guide. When I run: sudo /usr/bin/s3fs bucket_name /mnt/bucket_name/ it runs successfully (note: the bucket name is the same as the folder name in /mnt/). When I run ls in /mnt/ I get the error "ls: cannot access bucket_name: Permission denied". When I run sudo chmod 640 /mnt/bucket_name I get "chmod: changing permissions of `bucket_name': Input/output error". When I reboot the machine I can access the folder /mnt/bucket_name normally, but it is no longer mapped to the S3 bucket. So, basically, I have two questions: 1) How do I access the folder (/mnt/bucket_name) as usual after I mount it to the S3 bucket, and 2) How can I keep it mounted even after a machine restart? Regards
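
    A sketch of one approach that may address both questions (assuming the s3fs build from the Google Code project and that the credentials file is already in place): mount with allow_other so non-root users can read the bucket, and add an fstab entry so it is remounted at boot.
      # allow_other lets non-root users traverse the mount; it requires user_allow_other in /etc/fuse.conf
      sudo /usr/bin/s3fs bucket_name /mnt/bucket_name/ -o allow_other
      # Possible /etc/fstab line so the bucket comes back after a reboot (syntax used by the older s3fs releases)
      s3fs#bucket_name /mnt/bucket_name fuse allow_other,_netdev 0 0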

    Read the article

  • Using amazon s3 with cloudfront as a CDN

    - by weezybizzle
    I would like to serve user uploaded content (pictures, videos, and other files) from a CDN. Using Amazon S3 with CloudFront seems like a reasonable way to go. My only question is about the speed of the file system. My plan was to host user media at URIs of the form cdn.mycompany.com/u/u/i/d/uuid.jpg. I don't have any prior experience with S3 or CDNs, and I was just wondering if this strategy would scale well to handle a large amount of user uploaded content, and if there might be a more conventional way to accomplish this.

    Read the article

  • Best practices for using Amazon SQS - Polling the queue

    - by alex
    I'm designing a service for sending out emails for our eCommerce site (order confirmations, alerts, etc.). The plan is to have a "SendEmail" method that generates a chunk of XML representing the email to be sent and sticks it on an Amazon SQS queue. My web app(s) and other applications will use this to "send" emails. I then require a way of checking the queue and physically sending out the email messages. (I know how I'm going to be dispatching emails.) I'm curious as to what the best way to "poll" the queue would be. Should I create a Windows service and use something like Quartz.net to schedule it to check the queue every x number of minutes, for example? Is there a better way of doing this?
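
    Whatever scheduler ends up driving it, the polling pattern itself is roughly the same; a minimal command-line sketch is below (assuming the AWS CLI and a queue URL in $QUEUE_URL; a Windows service or Quartz.net job would make the equivalent calls through the SDK).
      # Long-poll the queue: waits up to 20 seconds for messages instead of hammering the API
      aws sqs receive-message --queue-url "$QUEUE_URL" --wait-time-seconds 20 --max-number-of-messages 10
      # Once an email has actually been dispatched, delete its message so it is not redelivered
      aws sqs delete-message --queue-url "$QUEUE_URL" --receipt-handle "$RECEIPT_HANDLE"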

    Read the article

  • Problem downloading .exe file from Amazon S3 with a signed URL in IE

    - by Joe Corkery
    I have a large collection of Windows exe files which are being stored/distributed using Amazon S3. We use signed URLs to control access to the files, and this works great except in one case: trying to download a .exe file using Internet Explorer (version 8). It works just fine in Firefox. It also works fine if you don't use a signed URL (but that is not an option). What happens is that the IE downloader changes the name from 'myfile.exe' to 'myfile[1]' and Windows no longer recognizes it as an executable. Any advice would be greatly appreciated. Thanks.
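
    One thing that may be worth trying (an untested sketch): S3 lets a signed GET override response headers via query parameters, so the signed URL can force a filename and content type that IE should respect. The overrides must be included in the string you sign; the bucket name below is a placeholder.
      https://mybucket.s3.amazonaws.com/myfile.exe?response-content-disposition=attachment%3B%20filename%3Dmyfile.exe&response-content-type=application/octet-stream&AWSAccessKeyId=...&Expires=...&Signature=...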

    Read the article

  • Amazon-like ecommerce site

    - by Soule
    Hey there, my idea was to make an e-commerce site a lot like Amazon. Not exactly cloning it, but since it's for a niche market, I need something like it. I was thinking of using Magento or something like that as a base, but I can't figure out how to let users: sign up for an account and get verified by me; add items so they are searchable; and post product reviews. What can I use to achieve/build this, and what are some suggestions? I can code in PHP and Python, thanks!

    Read the article

  • Using AWS S3 for photo storage

    - by Sam
    I'm going to be using S3 to store user uploaded photos. Obviously, I won't be serving the image files to user agents without resizing them down. However, no single size will do, as some thumbnails will be smaller than other, larger previews. So, I was thinking of making a standard set of dimensions, ranging from 16x16 at the low end up to 1024x1024. Is this a good way to solve this problem? What if I need a new size later on? How would you solve this?
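
    For what it's worth, a throwaway sketch of generating such a fixed ladder of sizes at upload time (assuming ImageMagick is installed; the file names and the size list are only illustrative):
      # Generate one derivative per size in the ladder before pushing them to S3
      for size in 16 32 64 128 256 512 1024; do
          convert original.jpg -resize "${size}x${size}" "photo_${size}.jpg"
      done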

    Read the article

  • What are the steps needed to set up and use security for AWS command line tools?

    - by chris
    I've been trying to set up the AWS command-line tools following Eric's most useful guide at http://alestic.com/2012/09/aws-command-line-tools. I can't seem to find a good how-to for generating the X.509 certificate and private key, and for how that relates to the various security files the guide creates. Update: I have found a couple of links that describe some of the steps. These steps seem to work, however I'm not sure if this is secure and the best way to do it: 1) Create a private key: openssl genrsa -out my-private-key.pem 2048 2) Create an X.509 cert: openssl req -new -x509 -key my-private-key.pem -out my-x509-cert.pem -days 365 Hit enter to accept all of the defaults. Then, from the IAM Dashboard, Users, select a user and click on the "Security Credentials" tab. Click on "Manage Signing Certificates", then "Upload Signing Certificate", paste in the contents of my-x509-cert.pem, click OK and it should be accepted. One step that is discussed, but was not required for me, was the addition and subsequent removal of a pass phrase on the private key. Should I have been prompted for one, and is my cert potentially unsafe because of this?
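
    On the pass phrase question, a hedged sketch (file locations are placeholders): openssl genrsa only prompts for a pass phrase if you explicitly ask for an encrypted key, and you can add one after the fact; the classic EC2 API tools then find the pair through environment variables.
      # Request an encrypted key up front...
      openssl genrsa -aes256 -out my-private-key.pem 2048
      # ...or encrypt an existing unencrypted key afterwards
      openssl rsa -aes256 -in my-private-key.pem -out my-private-key-encrypted.pem
      # The older EC2 API tools locate the certificate and key via environment variables
      export EC2_PRIVATE_KEY=$HOME/.aws/my-private-key.pem
      export EC2_CERT=$HOME/.aws/my-x509-cert.pem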

    Read the article

  • AWS VPC - why have a private subnet at all?

    - by jkim
    In Amazon VPC, the VPC creation wizard allows one to create a single "public subnet" or have the wizard create a "public subnet" and a "private subnet". Initially, the public and private subnet option seemed good for security reasons, allowing webservers to be put in the public subnet and database servers to go in the private subnet. But I've since learned that EC2 instances in the public subnet are not reachable from the Internet unless you associate an Amazon Elastic IP with the EC2 instance. So it seems that with just a single public subnet configuration, one could simply opt not to associate an Elastic IP with the database servers and end up with the same sort of security. Can anyone explain the advantages of a public + private subnet configuration? Are the advantages of this config more to do with auto-scaling, or is it actually less secure to have a single public subnet?

    Read the article

  • How to reduce memory consumption on an AWS EC2 t1.micro instance (free tier) Ubuntu Server 14.04 LTS EBS

    - by CMPSoares
    Hi, I'm working on my bachelor thesis and for that I need to host a node.js web application on AWS. In order to avoid costs I'm using a t1.micro instance with 30GB of disk space (from what I know, the maximum in the free tier), which is barely used. Instead I have problems with memory consumption: it's using all of it. I tried the approach of creating a virtual swap area as mentioned at "Why don't EC2 ubuntu images have swap?" with these commands: sudo dd if=/dev/zero of=/var/swapfile bs=1M count=2048 && sudo chmod 600 /var/swapfile && sudo mkswap /var/swapfile && echo /var/swapfile none swap defaults 0 0 | sudo tee -a /etc/fstab && sudo swapon -a But somehow this swap area isn't being used. Is something missing in this approach, or is there another way of reducing the memory consumption on this type of AWS instance? Bottom line: this causes server freezes and crashes, and that's what I want to stop, either by using the swap, by reducing memory usage, or both.
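
    A few quick checks that may narrow down why the swap isn't being used (nothing here is EC2-specific):
      sudo swapon -s            # should list /var/swapfile
      free -m                   # the Swap total should be roughly 2048
      grep swapfile /etc/fstab  # confirm the fstab line was actually appended
      # If the file is listed in fstab but not active, try enabling it explicitly
      sudo swapon /var/swapfile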

    Read the article

  • How to 'LOAD DATA INFILE' on Amazon RDS?

    - by feydr
    Not sure if this is a question better suited for Server Fault, but I've been messing with Amazon RDS lately and was having trouble getting 'file' privileges for my web host MySQL user. I'd assume that a simple: grant file on *.* to 'webuser@'%'; would work, but it does not, and I can't seem to do it with my 'root' user either. What gives? The reason we use LOAD DATA is that it is extremely fast for doing thousands of inserts at once. Does anyone know how to remedy this, or do I need to find a different way? This page, http://docs.amazonwebservices.com/AmazonRDS/latest/DeveloperGuide/index.html?Concepts.DBInstance.html, seems to suggest that I need to find a different way around this. Help? UPDATE: I'm not trying to import a database -- I just want to use the file load option to insert several hundred-thousand rows at a time. After digging around, this is what we have:
    mysql> grant file on *.* to 'devuser'@'%';
    ERROR 1045 (28000): Access denied for user 'root'@'%' (using password: YES)
    mysql> select User, File_priv, Grant_priv, Super_priv from mysql.user;
    +----------+-----------+------------+------------+
    | User     | File_priv | Grant_priv | Super_priv |
    +----------+-----------+------------+------------+
    | rdsadmin | Y         | Y          | Y          |
    | root     | N         | Y          | N          |
    | devuser  | N         | N          | N          |
    +----------+-----------+------------+------------+
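
    A possible workaround sketch (host, database, table and file names are placeholders): RDS won't grant the FILE privilege, but LOAD DATA LOCAL INFILE reads the file on the client side, so it doesn't need that privilege on the server.
      # Run from the web host where the data file lives; --local-infile=1 enables the client-side variant
      mysql --local-infile=1 -h mydbinstance.rds.amazonaws.com -u devuser -p mydb \
        -e "LOAD DATA LOCAL INFILE '/path/to/rows.csv' INTO TABLE mytable FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"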

    Read the article

  • How to get the Amazon S3 PHP SDK working?

    - by JakeRow123
    I'm trying to set up S3 for the first time and am trying to run the sample file that comes with the PHP SDK, which creates a bucket and attempts to upload some demo files to it. But this is the error I am getting: "The difference between the request time and the current time is too large." I read in another question on SO that this is because Amazon determines a valid request by comparing the times between the server and the client; the two must be within a 15 minute span of one another. Now here is the problem. My laptop's time is 12:30AM June 8, 2012 at the moment. On my server I created a file called servertime.php and placed this code in that file: <?php print strftime('%c'); ?> and the output is: Fri Jun 8 00:31:22 2012. It looks like the day is correct, but I don't know what to make of 00:31:22. In any case, how is it possible to always make sure the time between the client and server is within a 15 minute window of one another? What if I have a user in China who wishes to upload a file on my site, which uses S3 as the CDN? Then the time difference would be over a day. How can I make sure all my users' times are within 15 minutes of my server time? What if the user is in the U.S. but the time on their machine is misconfigured? Basically, how do I get S3 bucket creation and upload to work?
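
    One point that may simplify this: the 15-minute check compares Amazon's clock with the clock of the machine that signs the API request, i.e. your web server, not your visitors' machines, so a user in China or a misconfigured client clock shouldn't matter. A sketch of re-syncing the server clock, assuming a typical Linux host with ntp tooling:
      # One-off correction (assumes ntpdate is installed)
      sudo ntpdate pool.ntp.org
      # Or install an NTP daemon so the clock stays in sync
      sudo apt-get install ntp    # Debian/Ubuntu; use yum on RHEL-style systems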

    Read the article

  • Amazon Product Advertising API - ItemLookup request working example

    - by I__
    Would anyone have a working example of an Amazon ItemLookup? I have the following code, but it does not seem to work:
    string ISBN = "0393326381";
    string ASIN = "";
    if (!(string.IsNullOrEmpty(ISBN) && string.IsNullOrEmpty(ASIN)))
    {
        AWSECommerceServicePortTypeChannel service = new AWSECommerceServicePortTypeChannel();
        ItemLookup lookup = new ItemLookup();
        ItemLookupRequest request = new ItemLookupRequest();
        lookup.AssociateTag = secretKey;
        lookup.AWSAccessKeyId = accessKeyId;
        if (string.IsNullOrEmpty(ASIN))
        {
            request.IdType = ItemLookupRequestIdType.ISBN;
            request.ItemId = new string[] { ISBN.Replace("-", "") };
        }
        else
        {
            request.IdType = ItemLookupRequestIdType.ASIN;
            request.ItemId = new string[] { ASIN };
        }
        request.ResponseGroup = new string[] { "OfferSummary" };
        lookup.Request = new ItemLookupRequest[] { request };
        response = service.ItemLookup(lookup);
        if (response.Items.Length > 0 && response.Items[0].Item.Length > 0)
        {
            Item item = response.Items[0].Item[0];
            if (item.MediumImage == null)
            {
                //bookImageHyperlink.Visible = false;
            }
            else
            {
                //bookImageHyperlink.ImageUrl = item.MediumImage.URL;
            }
            //bookImageHyperlink.NavigateUrl = item.DetailPageURL;
            //bookTitleHyperlink.Text = item.ItemAttributes.Title;
            //bookTitleHyperlink.NavigateUrl = item.DetailPageURL;
            if (item.OfferSummary.LowestNewPrice == null)
            {
                if (item.OfferSummary.LowestUsedPrice == null)
                {
                    //priceHyperlink.Visible = false;
                }
                else
                {
                    //priceHyperlink.Text = string.Format("Buy used {0}", item.OfferSummary.LowestUsedPrice.FormattedPrice);
                    //priceHyperlink.NavigateUrl = item.DetailPageURL;
                }
            }
            else
            {
                //priceHyperlink.Text = string.Format("Buy new {0}", item.OfferSummary.LowestNewPrice.FormattedPrice);
                //priceHyperlink.NavigateUrl = item.DetailPageURL;
            }
            if (item.ItemAttributes.Author != null)
            {
                //authorLabel.Text = string.Format("By {0}", string.Join(", ", item.ItemAttributes.Author));
            }
            else
            {
                //authorLabel.Text = string.Format("By {0}", string.Join(", ", item.ItemAttributes.Creator.Select(c => c.Value).ToArray()));
            }
            /*
            ItemLink link = item.ItemLinks.Where(i => i.Description.Contains("Wishlist")).FirstOrDefault();
            if (link == null)
            {
                //wishListHyperlink.Visible = false;
            }
            else
            {
                //wishListHyperlink.NavigateUrl = link.URL;
            }
            */
        }
    }
    The problem is with this line, which I think should be defined differently, but I do not know how: AWSECommerceServicePortTypeChannel service = new AWSECommerceServicePortTypeChannel();

    Read the article

  • Amazon AMIs and Oracle VM templates

    - by llaszews
    I have worked with Oracle VM templates and most recently with Amazon Machine Images (AMI). The similarities in the functionality and capabilities they provide are striking. Just take a look at the definitions: An Amazon Machine Image (AMI) is a special type of pre-configured operating system and virtual application software which is used to create a virtual machine within the Amazon Elastic Compute Cloud (EC2). It serves as the basic unit of deployment for services delivered using EC2. (AWS AMIs) Oracle VM Templates provide an innovative approach to deploying a fully configured software stack by offering pre-installed and pre-configured software images. Use of Oracle VM Templates eliminates the installation and configuration costs, and reduces the ongoing maintenance costs, helping organizations achieve faster time to market and lower cost of operations. (Oracle VM Templates) Other things they have in common: 1. Both have 35 Oracle images or templates: AWS AMI pre-built images; Oracle pre-built VM Templates. 2. Both allow you to build your own images or templates: A. OVM Template Builder - Oracle VM Template Builder, an open source, graphical utility that makes it easy to use Oracle Enterprise Linux "Just enough OS" (JeOS)-based scripts for developing pre-packaged virtual machines for Oracle VM. B. AMI 'builder' (AMI builder). However, AWS has the added benefit of letting you add your own AMI to the AWS AMI catalog (AMI - Adding to the AWS AMI catalog). Another plus with AWS and AMIs is that there are hundreds of MySQL AMIs (AWS MySQL AMIs). A benefit of Oracle VM templates is that they can run on any public or private cloud environment, not just AWS EC2. However, Oracle VM templates first need to be imported as AMIs before they can run in the AWS cloud.

    Read the article

  • EC2 persistence of machine

    - by Seagull
    I want to 'persist' my Amazon EC2 images. My scenario: I have a range of Windows and Linux machines. Some machines are EBS backed, whereas others are S3 backed. I need to be able to persist a machine (put it to sleep), preferably keeping all the settings I had active when the machine was running. I need to be able to quickly wake up a machine from sleep [ideally with an SLA of less than 2 min to turn on, if such an SLA is available with Amazon]. Here's the stuff that confuses me: AWS allows me to put EBS backed machines to sleep, but not S3 backed ones. I believe I can put S3 machines into some sort of persistence mode, but this involves shutting down the machine, writing it to S3 storage and then recovering from there (not a real sleep mode, but at least I don't continue to get billed for CPU). S3 backing seems to take a long time either to write a machine to disk or to recover (turn on) a machine. I can't immediately tell which machines are EBS backed and which are S3 backed. It seems like I can instantiate either type, but it's not immediately clear how Amazon decides whether a given machine should be EBS or S3 backed. Advice?

    Read the article

  • List DB2 version, OS and hardware on Linux? (AWS image)

    - by mestika
    Hello everybody, I'm not that familiar with Linux, but I'm currently working on an AWS image for an assignment and I need to display the DB2 version, the OS and the hardware. Is there a command or program of some sort I can use for this purpose? I tried an RPM called "Bonnie", but that only reports the throughput of the system. Thanks, Mestika
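
    A few standard commands that may cover all three (db2level assumes you run it in the DB2 instance owner's environment; the rest are generic Linux):
      db2level              # DB2 product version and fix pack level
      uname -a              # kernel version and architecture
      cat /etc/issue        # distribution and release
      cat /proc/cpuinfo     # CPU details
      free -m               # memory
      df -h                 # disks and free space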

    Read the article

  • AWS Free Usage Tier + Cloudflare... possible?

    - by crashintoty
    If I throw my MySQL/PHP app up on an Amazon EC2 instance (using their AWS Free Usage Tier program) and couple it with CloudFlare (the free plan, of course), roughly how many daily visitors can I comfortably handle before performance starts to suffer? Just looking for a rough estimate or educated guess - I understand this setup might be less than ideal, but I'm still very curious nonetheless. Thanks in advance

    Read the article

  • Why do AWS spot-instance prices spike above the "on demand" pricing?

    - by Laykes
    Amazon pricing on spot instance inconsistencies: this is something best explained through screenshots of a historical chart of instance pricing. If you look at a lot of the instance prices for spot instances, you will notice regular patterns of spikes. As you can see in the chart, the price for this compute-medium instance regularly spikes above the on demand price. A c1.medium instance (on demand) would only cost $0.186 per hour, but for a period of a few weeks, in zone B, the price would regularly spike to $1.20. This is some 6 times the actual on demand price. It's also not isolated. If you look at zone B again for small instances, there is a similar frequent spike, which goes to 4x the on demand pricing. Does anyone know why this happens? Here are a few suggestions: 1) Someone entered $1.2 instead of $0.12 (I would discount this since it happened 20 times over the space of 3 weeks). 2) Amazon regularly artificially inflates their prices by bidding on their own instances to get the most bang for their buck (I would discount this since it would be ridiculous and bad business). 3) Some company launched 1000 servers at once and wants to make sure that they all launch (I would discount this since they would presumably launch them at a price below the minimum on demand price; why would you pay above on demand for a single server?). 4) It's a bug in their reporting?

    Read the article

  • Can't Connect to IIS Ftp Site under Amazon EC2

    - by h3n
    IIS 7.5. FTP Firewall Support: data channel port range 49152-65535, with the external IP set to the Amazon EC2 static IP. FTP IPv4 Address Restrictions: allow the Amazon EC2 static IP. FTP Authentication: Anonymous enabled, Basic disabled, IIS Manager authentication enabled. FTP Authorization: allow all users, Read/Write. Windows Firewall (inbound): port 21 open, port range 49152-65535 open; (outbound): port 20 open. Amazon EC2 security group: custom TCP rules for port 21 and 49152-65535. It works in Internet Explorer when I type the address ftp://localhost on the server itself, but when I enter the Amazon EC2 static IP (ftp://IPADDRESS) it doesn't connect. I also can't connect with FileZilla.
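
    One frequently suggested thing to check with this kind of setup (hedged, since passive FTP failures can have several causes): Windows' stateful FTP filtering can interfere with the passive data connections IIS hands out, and disabling it sometimes helps. From an elevated command prompt:
      netsh advfirewall set global StatefulFtp disable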

    Read the article

  • Why does my custom Amazon EC2 AMI have limited instance type options?

    - by John
    The Basic 64-bit Amazon Linux AMI has the following instance type options available: Micro Large Extra-Large High-Memory Extra Large ... etc I booted up this AMI as a micro type, made customizations, shut it down, detached the volume, took a snapshot, and registered my own custom AMI: ec2-register --snapshot [snapshot_id] --description "my description" --name "my name" --kernel aki-427d952b That worked. HOWEVER, when I try to create an instance from my custom AMI, only the following instance types are available: Micro Small High-CPU Medium ... which coincidentally are the same instance types available if you try to boot up the 32-bit Amazon image. Why are the available instance types of my custom image varying from the available instance types of the image I based it off of?
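
    A hedged guess at the cause: Micro, Small and High-CPU Medium are exactly the 32-bit instance types, which suggests the custom AMI was registered as i386 rather than x86_64. If so, re-registering the same snapshot with an explicit architecture (and a kernel ID that matches it) may restore the 64-bit options; a sketch:
      ec2-register --architecture x86_64 --snapshot [snapshot_id] --description "my description" --name "my name" --kernel [matching 64-bit aki]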

    Read the article

  • Are whole VM images backed up on Amazon EC2/S3?

    - by John
    I've been trying to get my head around Amazon Web Services as a VPS provider. My understanding is that an EC2 instance running Windows is basically a Windows VM, very similar to renting a VPS from a more traditional hosting provider. I don't want complex backups, either to administer or to restore - if my restore involves installing SVN, MySQL, Jira, etc. on a new box before I can even try to restore the backup, then it's not great for me. What I really want is a service which backs up my entire VM: if the PC running the VPS dies, the VM image is installed on a new PC and off we go again. With Amazon being all about flexibility and elasticity, I wondered if they have this service? I can't figure it out from reading their docs.
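
    For EBS-backed instances, the closest equivalent may be creating an AMI of the running instance, which snapshots its volumes into a reusable image that can be launched on fresh hardware later. A sketch with the classic API tools (the instance ID is a placeholder; the same thing is available as "Create Image" in the console):
      ec2-create-image i-xxxxxxxx --name "whole-vm-backup" --description "full image of my VPS"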

    Read the article

  • Why do I get "Permission denied (publickey)" when trying to SSH from local Ubuntu to an Amazon EC2 server?

    - by Vorleak Chy
    I have an instance of an application running in the cloud on an Amazon EC2 instance, and I need to connect to it from my local Ubuntu machine. It works fine from one of my local Ubuntu machines and also from my laptop, but I get the message "Permission denied (publickey)" when trying to SSH to EC2 from another local Ubuntu machine. It's very strange to me. I'm thinking it's some sort of problem with the security settings on the Amazon EC2 instance, which may limit which IPs can access the instance, or the certificate may need to be regenerated. Does anyone know a solution?
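
    The usual suspects to compare against the Ubuntu machine that works (a hedged checklist; the key path, user name and host name are placeholders):
      # The private key of the key pair used to launch the instance must be present, passed explicitly, and not world-readable
      chmod 400 ~/.ssh/my-ec2-key.pem
      ssh -i ~/.ssh/my-ec2-key.pem ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
      # Depending on the AMI the login user may be ec2-user or root instead of ubuntu; add -v to see which keys the client offers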

    Read the article

  • How to set up Amazon EC2 with my own OS and DB?

    - by Spencer Lim
    I have my own versions of the OS and DB, namely Windows Server 2008 Hyper-V R2 and SQL Server 2008 R2, both Enterprise edition. May I know how to get them configured and running? With Amazon EC2, what else must be combined with them to make this work? Also, how would I install the operating system and DNS? I have never run a server before, but I just need something like a VPS to support my development and testing. Amazon EC2 seems like the best and cheapest service, at only $1 per hour. Any brief guide would be appreciated, thx =D

    Read the article

  • How to set up Amazon EC2 with my own OS and DB?

    - by SLim
    I have my own versions of the OS and DB, namely Windows Server 2008 Hyper-V R2 and SQL Server 2008 R2, both Enterprise edition. May I know how to get them configured and running? With Amazon EC2, what else must be combined with them to make this work? Also, how would I install the operating system and DNS? I have never run a server before, but I just need something like a VPS to support my development and testing. Amazon EC2 seems like the best and cheapest service, at only $1 per hour.

    Read the article
