Search Results

Search found 1351 results on 55 pages for 'aws s3'.

Page 20/55

  • Running multiple environments on one AWS EC2 instance (Elastic Beanstalk)

    - by Abbas
    I am very new to the Amazon AWS services. I was wondering if there is a way to run an instance of EC2 (say, Amazon Linux AMI) and then connect two environments to this instance. Particularly, I'd like to run a PHP and a Tomcat environment on a single EC2 instance. The problem is, every time I create a new environment in Elastic Beanstalk, it seems to create a new EC2 instance as well. Am I missing something here? I'd appreciate any hint on this.
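
    As designed, each Elastic Beanstalk environment provisions its own stack, including its own EC2 instance(s); environments do not share an instance. A minimal sketch (assuming boto3 and configured AWS credentials) that makes the one-environment-to-its-own-instances mapping visible:

        import boto3

        eb = boto3.client("elasticbeanstalk")

        # List every environment together with the EC2 instances it owns.
        for env in eb.describe_environments()["Environments"]:
            resources = eb.describe_environment_resources(
                EnvironmentName=env["EnvironmentName"]
            )["EnvironmentResources"]
            instance_ids = [i["Id"] for i in resources["Instances"]]
            print(env["EnvironmentName"], env["SolutionStackName"], instance_ids)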

  • Quantity in Amazon cart?

    - by user201140
    Hi, I'd like to change the quantity in an AWS order. I've used the form below, and it works fine for adding one item to the cart, but when I try to change it to a quantity of 2 it says there are no items in my cart. What am I doing wrong? Thanks.

        <form method="GET" action="http://www.amazon.com/gp/aws/cart/add.html" id="Quantity">
          <input type="hidden" name="AWSAccessKeyId" value="Access Key ID" />
          <input type="hidden" name="AssociateTag" value="Associate Tag" />
          <?php echo "<input type='hidden' name='ASIN.1' value='".$itemid."'/>"; ?>
          <input type="hidden" name="Quantity.1" value="1"/><br/>
          <input type="image" src="buynow.gif" value="Submit" alt="Submit">
        </form>
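
    For what it's worth, the same GET request is easy to reproduce outside the form when testing different quantities. A sketch in Python using the parameter names from the form above (the access key, associate tag, and ASIN values are placeholders):

        from urllib.parse import urlencode

        def cart_add_url(asin, quantity, access_key="ACCESS_KEY_ID", tag="ASSOCIATE_TAG"):
            params = {
                "AWSAccessKeyId": access_key,
                "AssociateTag": tag,
                "ASIN.1": asin,
                "Quantity.1": quantity,  # the hidden field in the form above is pinned to 1
            }
            return "http://www.amazon.com/gp/aws/cart/add.html?" + urlencode(params)

        print(cart_add_url("B000EXAMPLE", 2))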

  • How to run AWS sample Java code on an EC2 instance

    - by SeaPlusPlus
    I just started with Amazon Web Services, and I have an EC2 instance. I downloaded the Java SDK and the Eclipse toolbox. I am able to run a sample program locally on my PC and connect to the Amazon databases, etc. My question is: what do I need to do to get this working on my EC2 instance? This may not even be specific to AWS. In Eclipse, I can just "Run as Application" and run any code. On the server side, what do I need to do? Should I FTP over my .java files? Should I export it to a jar and upload that? Do I need to install anything special to actually run it? I'm just trying to run the basic DynamoDB example that connects to the database and adds a new table and row.

  • Does the iPhone support background processes or services?

    - by Fedrick
    Hi all, I am planning to develop an iPhone client application to upload images from the iPhone gallery to Amazon S3 using REST calls. Is there a library to run this application as a background process on the iPhone? Also, is there a library to access the iPhone photo gallery? (It should be able to access all the images, not only a selected one as in UIImagePickerController.) Thanks in advance to the Stack Overflow masters for sharing their knowledge.

  • Is there a sync services library for the iPhone?

    - by Fedrick
    Hi all, I am developing a mobile client to sync images from the iPhone photo gallery to Amazon S3. Are there any sync services libraries that can help me in this regard? Also, is there a library to access the iPhone photo gallery? I just want to pick all photos, randomly, from the images stored on the device with no user interaction. Thanks in advance.

  • How can I script an alert for when my Amazon Web Service usage goes above a certain amount?

    - by frabcus
    We're using S3, SimpleDB and SQS on quite a complicated project. I'd like to be able to automatically track their usage, to be sure we don't suddenly spend large amounts of money when we didn't intend to (perhaps because of a bug). Is there a way of reading the usage figures of all Amazon Web Services and/or the current real time dollar cost of an account from a script? Or any service or script which provides alerts based on that?
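
    Nowadays the usual route for this is a CloudWatch alarm on the account's EstimatedCharges metric. A sketch with boto3 (note that billing metrics require enabling billing alerts in the account preferences and live only in us-east-1; the SNS topic ARN is a hypothetical placeholder):

        import boto3

        # Billing metrics are published only in us-east-1.
        cw = boto3.client("cloudwatch", region_name="us-east-1")

        cw.put_metric_alarm(
            AlarmName="monthly-charges-over-100-usd",
            Namespace="AWS/Billing",
            MetricName="EstimatedCharges",
            Dimensions=[{"Name": "Currency", "Value": "USD"}],
            Statistic="Maximum",
            Period=21600,              # evaluate over 6-hour windows
            EvaluationPeriods=1,
            Threshold=100.0,
            ComparisonOperator="GreaterThanThreshold",
            # Hypothetical SNS topic that emails/pages you when the alarm fires.
            AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],
        )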

  • Cache front end for the JetS3t API

    - by Joshua
    Storage via the JetS3t REST API seems to be very slow. Is there a caching front end for the JetS3t API that avoids a network hit on the fetch calls? See S3Service.getObject: http://jets3t.s3.amazonaws.com/api/org/jets3t/service/S3Service.html#getObject(org.jets3t.service.model.S3Bucket, java.lang.String, java.util.Calendar, java.util.Calendar, java.lang.String[], java.lang.String[], java.lang.Long, java.lang.Long)
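
    JetS3t itself is Java, so any such cache would live on that side; purely to illustrate the caching idea, here is a sketch in Python with boto3 that keeps fetched objects on disk and revalidates with a cheap HEAD on the ETag instead of re-downloading the body (paths and names are illustrative):

        import os
        import boto3

        s3 = boto3.client("s3")

        def cached_get(bucket, key, cache_dir="/tmp/s3cache"):
            """Return object bytes, skipping the download when the cached ETag still matches."""
            os.makedirs(cache_dir, exist_ok=True)
            body_path = os.path.join(cache_dir, key.replace("/", "_"))
            etag_path = body_path + ".etag"

            etag = s3.head_object(Bucket=bucket, Key=key)["ETag"]  # cheap revalidation round trip
            if os.path.exists(etag_path):
                with open(etag_path) as f:
                    if f.read() == etag:
                        with open(body_path, "rb") as body:
                            return body.read()   # cache hit: no body transfer

            data = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            with open(body_path, "wb") as f:
                f.write(data)
            with open(etag_path, "w") as f:
                f.write(etag)
            return data

    Note the design trade-off: a HEAD per read is still one small network round trip, but it guarantees the cache is never stale.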

  • Conditional action based on whether any file in a directory has a ctime newer than X

    - by jberryman
    I would like to run a backup job on a directory tree from a bash script if any of the files have been modified in the last 30 minutes. I think I can hack together something using find with the -ctime flag, but I'm sure there is a standard way to examine a directory for changes. I know that I can inspect the ctime of the top level directory to see if files were added, but I need to be able to see changes also. FWIW, I am using duplicity to backup directories to S3.
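
    Besides a one-liner like `find /data/tree -mmin -30` (or `-cmin -30` for status changes), the same test is easy to script directly. A sketch (the directory and the duplicity target are placeholders; it checks mtime, since ctime also changes on metadata-only updates):

        import os
        import subprocess
        import time

        def modified_recently(root, minutes=30):
            """True if any file under root has an mtime within the last `minutes`."""
            cutoff = time.time() - minutes * 60
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    try:
                        if os.stat(os.path.join(dirpath, name)).st_mtime > cutoff:
                            return True
                    except OSError:
                        pass  # file vanished mid-walk; ignore it
            return False

        if modified_recently("/data/tree"):
            # Placeholder target URL; use whatever your duplicity setup expects.
            subprocess.run(["duplicity", "/data/tree", "s3+http://bucket/prefix"])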

  • Automatically Log In & Start Up a Windows Program on Amazon's EC2 Service

    - by darkAsPitch
    How can I automatically start a program on Amazon's EC2 Windows 2008 web servers? For example, if I wanted to test the "Digg effect" on a web page of mine, how could I open 100 Windows 2008 servers at once, each loading one (or two?) instances of the Firefox web browser? I have placed a sample batch file in the Windows startup folder that echoes the time it was called, but it is only started when I actually log in remotely via the Remote Desktop Protocol. I don't want to have to log in to 100 servers to get my software to run :P What can I do? I am using this Windows 2008 Datacenter, Amazon-supplied AMI specifically: ami-a2698bcb

  • How to deploy a local project to Amazon EC2

    - by Nai
    I have a small webapp written in Python/Django which works fine on my local machine. I've been tinkering and setting up my server on the free tier of Amazon EC2 by following online tutorials. However, the tutorials I have found so far show you how to set up your instance and stop there. So my question is: how do I get my local webapp onto my Amazon instance? FYI, I'm a sysadmin/web dev noob. Thanks.
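
    One common Python-side approach is Fabric, which scripts the copy-and-restart steps over SSH. A minimal fabfile.py sketch using the Fabric 1.x API (the host, user, key, paths, and reload trick are all placeholders for your own setup; plain scp or a git pull on the server works just as well):

        from fabric.api import env, put, run

        env.hosts = ["ec2-xx-xx-xx-xx.compute-1.amazonaws.com"]  # your instance's public DNS
        env.user = "ubuntu"
        env.key_filename = "~/.ssh/my-ec2-key.pem"

        def deploy():
            put("myapp.tar.gz", "/tmp/myapp.tar.gz")             # upload a tarball of the project
            run("tar -xzf /tmp/myapp.tar.gz -C /srv/myapp")      # unpack it on the server
            run("touch /srv/myapp/wsgi.py")                      # nudge the app server to reload

    You would then run `fab deploy` from the project directory on your local machine.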

  • Django upload form to S3: image and form validation

    - by citadelgrad
    I'm fairly new to both Django and Python. This is my first time using forms and uploading files with Django. I can get the uploads and saves to the database to work fine, but it fails to validate the email or check whether the user selected a file to upload. I've spent a lot of time reading documentation trying to figure this out. Thanks!

    views.py:

        def submit_photo(request):
            if request.method == 'POST':
                def store_in_s3(filename, content):
                    conn = S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
                    bucket = conn.create_bucket(AWS_STORAGE_BUCKET_NAME)
                    mime = mimetypes.guess_type(filename)[0]
                    k = Key(bucket)
                    k.key = filename
                    k.set_metadata("Content-Type", mime)
                    k.set_contents_from_file(content)
                    k.set_acl('public-read')
                if imghdr.what(request.FILES['image_url']):
                    qw = request.FILES['image_url']
                    filename = qw.name
                    image = filename
                    content = qw.file
                    url = "http://bpd-public.s3.amazonaws.com/" + image
                    data = {image_url : url,
                            user_email : request.POST['user_email'],
                            user_twittername : request.POST['user_twittername'],
                            user_website : request.POST['user_website'],
                            user_desc : request.POST['user_desc']}
                    s = BeerPhotos(data)
                    if s.is_valid():
                        #import pdb; pdb.set_trace()
                        s.save()
                        store_in_s3(filename, content)
                        return HttpResponseRedirect(reverse('photos.views.thanks'))
                    return s.errors
                else:
                    return errors
            else:
                form = BeerPhotoForm()
            return render_to_response('photos/submit_photos.html', locals(),
                                      context_instance=RequestContext(request))

    forms.py:

        class BeerPhotoForm(forms.Form):
            image_url = forms.ImageField(widget=forms.FileInput, required=True, label='Beer',
                                         help_text='Select an image of no more than 2MB.')
            user_email = forms.EmailField(required=True, help_text='Please type a valid e-mail address.')
            user_twittername = forms.CharField()
            user_website = forms.URLField(max_length=128)
            user_desc = forms.CharField(required=True, widget=forms.Textarea, label='Description')

    template.html:

        <div id="stylized" class="myform">
            <form action="." method="post" enctype="multipart/form-data" width="450px">
                <h1>Photo Submission</h1>
                {% for field in form %}
                    {{ field.errors }}
                    {{ field.label_tag }}
                    {{ field }}
                {% endfor %}
                <label><span>Click here</span></label>
                <input type="submit" class="greenbutton" value="Submit your Photo" />
            </form>
        </div>
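
    For comparison, a hedged sketch of the standard Django binding pattern the view above seems to be missing: validation runs when the form is constructed with request.POST and request.FILES and you read form.cleaned_data, rather than assembling a dict of raw values (the unquoted keys like image_url above are undefined names and would raise a NameError before validation ever runs). Imports and the store_in_s3 helper are as in the post; BeerPhotos is presumed to be the model:

        def submit_photo(request):
            if request.method == 'POST':
                form = BeerPhotoForm(request.POST, request.FILES)  # bind data AND files
                if form.is_valid():                                # email/file checks run here
                    image = form.cleaned_data['image_url']         # an UploadedFile
                    store_in_s3(image.name, image.file)
                    photo = BeerPhotos(
                        image_url="http://bpd-public.s3.amazonaws.com/" + image.name,
                        user_email=form.cleaned_data['user_email'],
                        user_twittername=form.cleaned_data['user_twittername'],
                        user_website=form.cleaned_data['user_website'],
                        user_desc=form.cleaned_data['user_desc'],
                    )
                    photo.save()
                    return HttpResponseRedirect(reverse('photos.views.thanks'))
                # invalid: fall through and re-render; field.errors shows in the template
            else:
                form = BeerPhotoForm()
            return render_to_response('photos/submit_photos.html', locals(),
                                      context_instance=RequestContext(request))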

  • Restoring WordPress EC2 instance from snapshot results in 403 Forbidden error

    - by Eric Matthew Turano
    This problem has been perplexing me for weeks now. Here's how the issue goes:

    1. Launch an Amazon Linux 64-bit instance, successfully install WordPress, and the site is active with no issues.
    2. Create a snapshot of the instance's root volume.
    3. Shut down the instance.
    4. Create a volume from the snapshot, attach it to the instance, and reboot the instance.
    5. Associate an Elastic IP with the instance.

    Once that's done and I try logging onto the site, I am redirected to myurl.com/wp-admin/install.php and greeted with this message:

        Forbidden: You don't have permission to access /wp-admin/install.php on this server.
        Apache/2.2.25 (Amazon) Server at www.myurl.com Port 80

    Port 80 is open in the inbound security group settings, so that's not the issue. Keep in mind all I am doing is merely creating a new volume and attaching it to the same instance, and this issue comes up. What am I doing wrong, and how can I create a complete backup of my instance without this error occurring?

  • Does cloud storage replicate the data over many data centers, and if so, do I benefit from content delivery?

    - by Berkay
    Let's assume that I want to use a cloud storage service from one of the cloud storage providers. I have X GB of structured and unstructured data, and I will use this data as the content of my interactive web page. I have many users, and they visit my web page from various countries, which leaves me with some doubts. To be more specific: first, is my data stored in only one of the provider's data centers, or is it replicated over many of them? Second, if it is replicated, how can I benefit from a content delivery network (matching and placing users' content in the nearest storage data centers)?

  • Is there the equivalent of cloud computing for modems?

    - by morpheous
    I asked this question on SF, and someone recommended that I ask it here. (I don't think I have enough points to move a question from SF to SO, and in any case I don't know how to do it, so here is the question again.) I am interested in the concept of PaaS (platform as a service). However, all talk about SaaS/PaaS seems to focus only on the computer itself, not its peripherals. Is it possible to 'outsource' modems as a resource, so that an app running remotely can pump data to a modem in the cloud? As a bit of background to the question: a group of us are thinking of starting a company that offers services similar to companies like Twilio etc., but I want to 'outsource' both the computing hardware (that's PaaS, the easy bit) and the modems (that's what I can't seem to find any info on). Does anyone know if modems can be bundled as part of a PaaS service? Alternatively, is there a way for an application running on one computer to communicate with (i.e. pump data to) a remote modem residing on another machine? I assume I can come up with some protocol over UDP or TCP, but there is no point reinventing the wheel if such a protocol already exists (or if some open source software allows one to do this). Any suggestions on how to solve this problem?

  • Can't seem to sign an AWS CloudFront URL properly

    - by Joe Corkery
    Hi everybody, I found a lot of detailed examples online on how to sign an Amazon CloudFront URL for private content. Unfortunately, whenever I implement these examples my URL doesn't seem to work. The resource path is correct, because I can download the file when it is set for world read, but the URL doesn't work when it is set just for authorized users. The PHP code I am using is below. If anybody has any insights as to what I might be doing wrong (I'm guessing it is something obvious that I am just not seeing right now), it would be greatly appreciated.

        function urlCloudFront($resource) {
            $AWS_CF_KEY = 'APKA...';
            $priv_key = file_get_contents(path_to_pem_file);
            $pkeyid = openssl_get_privatekey($priv_key);
            $expires = strtotime("+ 3 hours");
            $policy_str = '{"Statement":[{"Resource":"'.$resource.'","Condition":{"DateLessThan":{"AWS:EpochTime":'.$expires.'}}}]}';
            $policy_str = trim(preg_replace('/\s+/', '', $policy_str));
            $res = openssl_sign($policy_str, $signature, $pkeyid, OPENSSL_ALGO_SHA1);
            $signature_base64 = base64_encode($signature);
            $repl = array('+' => '-', '=' => '_', '/' => '~');
            $signature_base64 = strtr($signature_base64, $repl);
            $url = $resource . '?Expires=' . $expires . '&Signature=' . $signature_base64 . '&Key-Pair-Id=' . $AWS_CF_KEY;
            print '<p><a href="' . $url . '">Download VIDA (CloudFront)</a>';
        }

        urlCloudFront("http://mydistcloud.cloudfront.net/mydir/myfile.tar.gz");

    Thanks.
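
    For cross-checking the PHP output, the same canned-policy URL can be produced with botocore's CloudFrontSigner. A sketch (assumes the `cryptography` package is installed; the key ID, .pem path, and URL are the placeholders from the post):

        from datetime import datetime, timedelta

        from botocore.signers import CloudFrontSigner
        from cryptography.hazmat.primitives import hashes, serialization
        from cryptography.hazmat.primitives.asymmetric import padding

        def rsa_signer(message):
            # CloudFront signed URLs use SHA-1 RSA signatures over the policy.
            with open("path_to_pem_file", "rb") as f:
                key = serialization.load_pem_private_key(f.read(), password=None)
            return key.sign(message, padding.PKCS1v15(), hashes.SHA1())

        signer = CloudFrontSigner("APKA...", rsa_signer)
        url = signer.generate_presigned_url(
            "http://mydistcloud.cloudfront.net/mydir/myfile.tar.gz",
            date_less_than=datetime.utcnow() + timedelta(hours=3),
        )
        print(url)

    Comparing the Expires/Signature query parameters this produces against the hand-rolled PHP ones is a quick way to isolate where the two diverge.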

  • Subsequent runs of rsync locally don't reduce data transferred

    - by sharakan
    I have an EC2 instance with data I want to sync to a mounted, but remote, volume as a backup. rsync seems like the way to go with this, so as a test I took my test file (a Postgres pg_dump file) and used rsync -v to copy it to the mounted volume:

        [ec2-user work]$ rsync -v dump.sql.1 ../backup/dump.sql
        dump.sql.1
        sent 821704315 bytes  received 31 bytes  3416650.09 bytes/sec
        total size is 821603948  speedup is 1.00

    Then I ran it again, expecting to see minimal sent/received numbers because it would just be checksums. Instead...

        [ec2-user work]$ rsync -v dump.sql.1 ../backup/dump.sql
        dump.sql.1
        sent 821704315 bytes  received 31 bytes  3402502.47 bytes/sec
        total size is 821603948  speedup is 1.00

    I'm new to rsync, so perhaps I'm missing something, but isn't the idea that the source and destination files are checked for differences, and then a patch is generated and applied to the destination? Why is this not reducing the amount of data 'sent' to just the size of the checksums? Some background, if it's relevant: the mounted volume is using s3fs, mounted with s3fs <bucketname> backup.

  • How can I deploy my .NET app to Amazon EC2?

    - by Khash
    I have a .NET Windows service and a .NET web application that I would like to deploy to my Amazon EC2 Windows 2008 instances. At this point, all I need to do is copy the zipped files across to the EC2 box, remote desktop to the EC2 instance, and finish the deployment. In order to do this, I tried LogMeIn Hamachi2 to create a P2P VPN and RoboCopy to copy the files; however, it seems Hamachi doesn't work on Windows EC2. What is your solution for deploying your .NET apps to Windows EC2 instances? I want to avoid running an FTP server on the box just to get my files up on the server, and I don't have a VPN server (like OpenVPN) running to run a cloud-based VPN solution. Perhaps I can find a simple way of using Amazon S3 as a strategy? Any ideas? Suggestions?
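
    On the S3 idea: a sketch with boto3 from the uploading side (bucket and key names are placeholders). A presigned URL then lets the instance pull the package down with nothing but a browser inside the RDP session, so no FTP server or VPN is needed:

        import boto3

        s3 = boto3.client("s3")

        # Push the zipped release from the dev machine to S3.
        s3.upload_file("release.zip", "my-deploy-bucket", "releases/release.zip")

        # Generate a time-limited download link to paste into the instance's browser.
        url = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "my-deploy-bucket", "Key": "releases/release.zip"},
            ExpiresIn=3600,  # valid for one hour
        )
        print(url)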

  • Distributed datastore

    - by Julien Genestoux
    We're trying to add some kind of persistence to our app. The app generates about 250 entries per second. Each of these entries belongs to one of 2M files. For each file, we want to keep the last 10 entries, so we can look them up later. The way our client application works:

    1. it gets a stream of all the data
    2. it fetches the right file (GET)
    3. it adds the new content
    4. it saves the file back (PUT)

    We're looking for an efficient way to store this data that can scale horizontally, as the amount of data we're getting is doubling every few weeks. We initially looked at S3. It works fine, but becomes very expensive very fast ($1000 monthly just in PUT operations!). We then gave Riak a shot, but it seems we can't get more than 60 writes/sec on each node, which is very, very slow. Any other solution out there?
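
    Concretely, the cycle above against S3 looks like this sketch (boto3; the bucket name and key layout are placeholders), which also makes the cost problem visible: every appended entry is a full GET plus a full PUT of the file:

        import json

        import boto3

        s3 = boto3.client("s3")
        BUCKET = "entries-bucket"

        def append_entry(file_id, entry, keep=10):
            key = f"files/{file_id}.json"
            try:
                body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()   # GET
                entries = json.loads(body)
            except s3.exceptions.NoSuchKey:
                entries = []                       # first entry for this file
            entries = (entries + [entry])[-keep:]  # keep only the newest 10
            s3.put_object(Bucket=BUCKET, Key=key,  # PUT the whole file back
                          Body=json.dumps(entries).encode())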

  • Serving files over HTTPS dynamically based on request.ssl? with Attachment_fu

    - by Marston A.
    I see there is a :use_ssl option in attachment_fu which checks the amazon_s3.yml file in order to serve files via https://. In s3_backend.rb you have this method:

        def self.protocol
          @protocol ||= s3_config[:use_ssl] ? 'https://' : 'http://'
        end

    But this then serves ALL S3 attachments over SSL. I'd like to make it dynamic, depending on whether the current request was made with https://, i.e.:

        if request.ssl?
          @protocol = "https://"
        else
          @protocol = "http://"
        end

    How can I make it work this way? I've tried modifying the method, and then I get:

        NameError: undefined local variable or method `request' for Technoweenie::AttachmentFu::Backends::S3Backend:Module

  • Setting up scripts in Amazon EC2 Cloud

    - by racket99
    Hello, I am currently running a few Perl and Python scripts on a Windows PC and would like to port them over to the Amazon EC2 servers running 64-bit Linux. The scripts are basic web scrapers that go to a variety of websites, get data, and then save it daily as CSV files. I would like to install these in the cloud and get them running in an automated way so that they will run without my intervention. Also, given that I don't want to lose all the data if the instance crashes, I should also upload the CSV files to Amazon S3. Any idea how I can do this? I am not terribly versed in Linux, nor do I know Perl/Python well. What is the best way for me to tackle this?
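
    A sketch of what one ported scraper run might look like (boto3; the URL, the "parsing", and the bucket name are all placeholders), which a crontab entry such as `0 6 * * * python3 /home/ec2-user/scrape.py` can then run daily without intervention:

        import csv
        import datetime
        import urllib.request

        import boto3

        def run():
            # Hypothetical target page; real parsing would go here.
            html = urllib.request.urlopen("http://example.com/prices").read()
            rows = [["scraped_at", "bytes"],
                    [datetime.date.today().isoformat(), len(html)]]

            # Save the daily CSV locally...
            name = "data-%s.csv" % datetime.date.today().isoformat()
            with open(name, "w", newline="") as f:
                csv.writer(f).writerows(rows)

            # ...then copy it to S3 so a crashed instance doesn't lose data.
            boto3.client("s3").upload_file(name, "my-scraper-bucket", name)

        if __name__ == "__main__":
            run()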

  • Heroku and Refinerycms: Application failed to start ~ attachment_fu problem

    - by John Deely
    OK, so I'm trying to get Refinery CMS working with Heroku, and I'm new at all of this. I've set up an Amazon S3 account and added keys and IDs to the amazon_s3.yml files. When launched on Heroku at gart.heroku.com, I get the following error:

        App failed to start
        /disk1/home/slugs/141557_e8490b3_d5eb/mnt/vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu/backends/s3_backend.rb:187:in `read': No such file or directory - /disk1/home/slugs/141557_e8490b3_d5eb/mnt/config/amazon_s3.yml (Errno::ENOENT)
          from /disk1/home/slugs/141557_e8490b3_d5eb/mnt/vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu/backends/s3_backend.rb:187:in `included'
          from /disk1/home/slugs/141557_e8490b3_d5eb/mnt/vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu.rb:123:in `include'
          from /disk1/home/slugs/141557_e8490b3_d5eb/mnt/vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu.rb:123:in `has_attachment'
          from /disk1/home/slugs/141557_e8490b3_d5eb/mnt/app/models/image.rb:13
          from /usr/local/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
          from /usr/local/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `require'
          from /usr/local/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:158:in `require'
          from /usr/local/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:265:in `require_or_load'
          ... 42 levels...
          from /usr/local/lib/ruby/gems/1.8/gems/rack-1.0.1/lib/rack/builder.rb:29:in `instance_eval'
          from /usr/local/lib/ruby/gems/1.8/gems/rack-1.0.1/lib/rack/builder.rb:29:in `initialize'
          from /home/heroku_rack/heroku.ru:1:in `new'
          from /home/heroku_rack/heroku.ru:1

    Line 187 of s3_backend.rb contains:

        @@s3_config = @@s3_config = YAML.load(ERB.new(File.read(@@s3_config_path)).result)[RAILS_ENV].symbolize_keys

    Any help would be great!

  • How to extend an existing Ruby on Rails CMS to host multiple sites?

    - by Andrew
    I am trying to build a CMS I can use to host multiple sites. I know I'm going to end up reinventing the wheel a million times with this project, so I'm thinking about extending an existing open source Ruby on Rails CMS to meet my needs. One of those needs is to be able to run multiple sites, while using only one code-base. That way, when there's an update I want to make, I can update it in one place, and the change is reflected on all of the sites. I think that this will be able to scale by running multiple instances of the application. I think that I can use the domain/subdomain to determine which data to display. For example, someone goes to subdomain1.mysite.com and the application looks in the database for the content for subdomain1. The problem I see is with most pre-built CMS solutions, they are only designed to host one site, including the one I want to use. So the database is structured to work with one site. However, I had the idea that I could overcome this by "creating a new database" for each site, then specifying which database to connect to based on the domain/subdomain as I mentioned above. I'm thinking of hosting this on Heroku, so I'm wondering what my options for this might be. I'm not very familiar with Amazon S3, or Amazon SimpleDB, but I feel like there's some sort of "cloud database" that would make this solution a lot more realistic, than creating a new MySQL database for each site. What do you think? Am I thinking about this the wrong way? What advice do you have to offer in this area?
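
    The Rails specifics aside, the domain-to-database lookup itself is small. A language-agnostic sketch of the idea, shown here in Python (the table and names are illustrative, not from any particular CMS; a Rails app would do the equivalent in a before_filter or middleware):

        # Hypothetical mapping from request host to the tenant's database settings.
        TENANT_DATABASES = {
            "subdomain1.mysite.com": {"name": "cms_subdomain1"},
            "subdomain2.mysite.com": {"name": "cms_subdomain2"},
        }

        def database_for(host):
            # Strip an optional port, then fall back to a default site.
            host = host.split(":")[0].lower()
            return TENANT_DATABASES.get(host, {"name": "cms_default"})

    Whether each tenant gets its own database or just a site_id column in shared tables is the real design decision; the shared-schema route is usually the easier one to host on Heroku.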

  • Wake On Lan (WOL) for Realtek RTL8101E/RTL8102E

    - by Heisennberg
    I'm unsuccessfully trying to get Wake-on-LAN to work with my local server (IP address 192.168.0.2, distro Ubuntu 12.04.3 LTS), which has a Realtek RTL8101E/RTL8102E Ethernet card. The computer sending the WOL packet is a MacBook Pro connected to the same network. Yet the server fails to start. Here is what I have done so far:

        name@serverName ~ $ cat /proc/acpi/wakeup
        Device  S-state   Status   Sysfs node
        HDEF      S3    *disabled  pci:0000:00:1b.0
        PXSX      S3    *disabled
        PXSX      S0    *enabled   pci:0000:04:00.0
        PXSX      S0    *disabled
        USB1      S3    *enabled   pci:0000:00:1d.0
        USB2      S3    *enabled   pci:0000:00:1d.1
        USB3      S3    *enabled   pci:0000:00:1d.2
        USB5      S3    *enabled   pci:0000:00:1a.1
        EHC1      S3    *enabled   pci:0000:00:1d.7
        EHC2      S3    *enabled   pci:0000:00:1a.7

        name@serverName ~ $ lspci
        ------
        04:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8101E/RTL8102E PCI Express Fast Ethernet controller (rev 01)
        ------

        name@serverName ~ $ sudo ethtool eth0
        Settings for eth0:
            Supported ports: [ TP MII ]
            Supported link modes:   10baseT/Half 10baseT/Full
                                    100baseT/Half 100baseT/Full
            Supported pause frame use: No
            Supports auto-negotiation: Yes
            Advertised link modes:  10baseT/Half 10baseT/Full
                                    100baseT/Half 100baseT/Full
            Advertised pause frame use: Symmetric Receive-only
            Advertised auto-negotiation: Yes
            Link partner advertised link modes: 10baseT/Half 10baseT/Full
                                                100baseT/Half 100baseT/Full
            Link partner advertised pause frame use: Symmetric Receive-only
            Link partner advertised auto-negotiation: Yes
            Speed: 100Mb/s
            Duplex: Full
            Port: MII
            PHYAD: 0
            Transceiver: internal
            Auto-negotiation: on
            Supports Wake-on: pumbg
            Wake-on: g
            Current message level: 0x00000033 (51)
                                   drv probe ifdown ifup
            Link detected: yes

    and I'm calling WOL with:

        name@serverName ~ $ wakeonlan xx:xx:xx:xx:xx
        Sending magic packet to 255.255.255.255:9 with xx:xx:xx:xx:xx

    I have successfully activated the WOL option in the server's BIOS. Any idea?
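
    For debugging from the sending side, the magic packet itself is trivial to craft by hand: 6 bytes of 0xFF followed by the target MAC repeated 16 times. A minimal sketch in Python (the MAC is a placeholder; the subnet broadcast address and port 9 match the wakeonlan output above):

        import socket

        def send_magic_packet(mac, broadcast="192.168.0.255", port=9):
            # 6 x 0xFF, then the 6-byte MAC repeated 16 times = 102 bytes.
            payload = bytes.fromhex("FF" * 6 + mac.replace(":", "") * 16)
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
                s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
                s.sendto(payload, (broadcast, port))

        send_magic_packet("00:11:22:33:44:55")  # placeholder MAC

    Sending to the subnet broadcast (192.168.0.255) rather than 255.255.255.255 rules out one class of delivery problem.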

  • SAP: HANA lands on AWS; the vendor continues to devise offerings to democratize its in-memory database

    SAP HANA lands on Amazon Web Services; the vendor continues to devise offerings to democratize its in-memory database. To benefit from, or simply test, SAP's "in-memory" database, an appliance or a dedicated server is no longer needed: the German vendor has just certified HANA for developer use on Amazon Web Services. "SAP HANA One is available on the AWS Marketplace," SAP announced. "It is now possible to run this powerful in-memory database on AWS EC2 for $0.99 per hour." As a reminder, ...
