Search Results

Search found 9847 results on 394 pages for 'cloud backup'.

Page 116 of 394

  • OpenSuse 12.1 released: GNOME 3.2, KDE 4.7, Go, improved Cloud support, and more stable and flexible virtualization

    OpenSuse 12.1 released: GNOME 3.2, KDE 4.7, Go, improved Cloud support, and more stable and flexible virtualization for the OS. A new major version of the OpenSuse Linux distribution is available. After more than eight months of work, version 12.1 of the operating system has just been published and brings an impressive number of new features, including Cloud and virtualization support. Based on Linux kernel 3.1, the OS ships with the GNOME Shell 3.2 desktop environment, offering better handling of small screens and multi-monitor configurations, improved notifications, and a centralized system for configuring online accounts.

  • Oracle releases Solaris 11 for the SPARC and x86 architectures and bills it as the "first Cloud OS"

    Oracle releases Solaris 11 for the SPARC and x86 architectures and bills it as the "first Cloud OS". Update of November 10, 2011, by Idelways. The major undertaking begun by Sun and then carried on by Oracle, Solaris 11, is finally complete. The Unix implementation code-named "Nevada" ships for the SPARC and x86 architectures and is presented by its creators as "the first Cloud OS". Oracle's messaging around Solaris 11 follows in the wake of its ado...

  • OpenStack opens up to Java and PHP: Rackspace publishes two development kits for the open source Cloud platform

    OpenStack opens up to PHP and Java: Rackspace publishes two development kits for the open source Cloud platform. OpenStack, the open source public and private cloud computing platform backed by IT giants such as Cisco and HP, is currently in the spotlight at the OpenStack Summit conference now taking place in San Diego. The annual event, which wraps up on October 18, gives the various players in the ecosystem an opportunity to present their products and the platform's latest developments.

  • System Center 2012: Microsoft's management platform for private Cloud infrastructure is available in its final version

    System Center 2012: Microsoft's management platform for private Cloud infrastructure is available in its final version. Update of 04/04/2012. System Center 2012, the comprehensive platform for administering desktops, servers, applications, and devices in physical or virtual environments, is available in its final version. The platform brings together, in a single unified solution, eight distinct products for deploying services to the Cloud, protecting data, managing non-Microsoft devices such as the iPad, and more (see above). ...

  • Are Numergy's first results good? The French "sovereign" Cloud turns one year old and continues to hire technical staff

    Are Numergy's first results good? The French "sovereign" Cloud turns one year old and continues to hire. The French "sovereign" Cloud Numergy has just celebrated its first anniversary. In the midst of the PRISM affair, the anniversary was an opportunity to go over, with one of its spokespersons, the first results and the goals of this project backed by SFR, Bull, and the Caisse des Dépôts (shareholders of 47%, 20% and 33% respectively), which continues to need...

  • Windows Azure turns one year old: do you use, and how well do you know, Microsoft's Cloud platform for developers?

    Windows Azure turns one year old: do you use, and how well do you know, Microsoft's Cloud platform for developers? In collaboration with Hinault Romarick. Microsoft's shift toward Cloud Computing has been marked by the release of several products (Dynamic CRM OnLine, Office 365, the Office Web Apps, etc.). But it is the arrival of its developer-oriented platform, Windows Azure, that has shown its commitment to IaaS (infrastructure on demand) and PaaS (platform on demand). First presented at the PDC 2008 conference, it was in ...

  • How can I install Ubuntu on my Nexus 7 while being able to recover from a Nandroid backup?

    - by MagicFab
    I use CyanogenMod and ClockWork Recovery on my Nexus 7. How can my existing full Nandroid backup be used to restore my device after installing Ubuntu? The instructions assume that "recovery" means re-flashing the vanilla image, back to factory, data-wiped condition. It would be useful to provide a .zip that can be flashed via Clockwork (or another) recovery using ROM Manager, or by booting into recovery and going back to whatever Nandroid backup there is - much as any other ROM is provided/used.

  • Finding distance to the closest point in a point cloud on a uniform grid

    - by erik
    I have a 3D grid of size AxBxC with equal distance, d, between the points in the grid. Given a number of points, what is the best way of finding, for each grid point, the distance to the closest point in the point cloud (every grid point should contain the distance to the closest point in the point cloud), given the assumptions below? Assume that A, B and C are quite big in relation to d, giving a grid of maybe 500x500x500, and that there will be around 1 million points. Also assume that if the distance to the nearest point exceeds a distance D, we do not care about the nearest point distance, and it can safely be set to some large number (D is maybe 2 to 10 times d). Since there will be a great number of grid points and points to search from, a simple exhaustive search:

        for each grid point:
            for each point:
                if distance between points < minDistance:
                    minDistance = distance between points

    is not a good alternative. I was thinking of doing something along the lines of:

        create a container of size A*B*C where each element holds a container of points
        for each point:
            define indexX = round((point position x - grid min position x)/d)   // same for y and z
            add the point to the correct index of the container
        for each grid point:
            search the container of that grid point and find the closest point
            if no points in container and D > 0.5d:
                search the 26 container indices nearest to the grid point for a closest point
                ... continue with the next layer until a point is found or the distance to that layer is greater than D

    Basically: put the points in buckets and do a radial search outwards until a point is found for each grid point. Is this a good way of solving the problem, or are there better/faster ways? A solution that is well suited to parallelisation is preferred.
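    A rough Java sketch of the bucket-and-expanding-shell idea outlined above (the grid sizes, spacing, the point format double[3] with coordinates measured from the grid origin, and all class/variable names are illustrative assumptions rather than anything from the question; the search is kept as approximate as the outline, so a strictly correct version would also scan one extra shell after the first hit before accepting the minimum):

        import java.util.ArrayList;
        import java.util.List;

        public class GridClosestPoint {

            static final int A = 64, B = 64, C = 64;  // kept small here; the real problem is ~500^3
            static final double d = 1.0;              // grid spacing
            static final double D = 4.0 * d;          // cut-off distance
            static final double FAR = 1e9;            // value stored when nothing lies within D

            @SuppressWarnings("unchecked")
            public static double[] closestDistances(List<double[]> points) {
                // One bucket per grid cell, holding the cloud points that fall into that cell.
                // For ~500^3 cells a flat array of lists is heavy; a hash map keyed by the cell
                // index would be a reasonable alternative.
                List<double[]>[] buckets = (List<double[]>[]) new List[A * B * C];
                for (double[] p : points) {
                    int ix = clamp((int) Math.round(p[0] / d), 0, A - 1);
                    int iy = clamp((int) Math.round(p[1] / d), 0, B - 1);
                    int iz = clamp((int) Math.round(p[2] / d), 0, C - 1);
                    int cell = (ix * B + iy) * C + iz;
                    if (buckets[cell] == null) buckets[cell] = new ArrayList<>();
                    buckets[cell].add(p);
                }

                int maxRing = (int) Math.ceil(D / d);  // outermost shell worth visiting
                double[] result = new double[A * B * C];

                // Each x-slice is independent, so the outer loop parallelises easily,
                // e.g. IntStream.range(0, A).parallel().forEach(...).
                for (int x = 0; x < A; x++)
                    for (int y = 0; y < B; y++)
                        for (int z = 0; z < C; z++) {
                            double best = FAR;
                            // Visit cubic shells of increasing Chebyshev radius r around the
                            // grid point; stop after the first shell that contained a point.
                            for (int r = 0; r <= maxRing && best == FAR; r++)
                                for (int dx = -r; dx <= r; dx++)
                                    for (int dy = -r; dy <= r; dy++)
                                        for (int dz = -r; dz <= r; dz++) {
                                            if (Math.max(Math.abs(dx), Math.max(Math.abs(dy), Math.abs(dz))) != r)
                                                continue;  // interior cells were covered by earlier shells
                                            int nx = x + dx, ny = y + dy, nz = z + dz;
                                            if (nx < 0 || ny < 0 || nz < 0 || nx >= A || ny >= B || nz >= C)
                                                continue;
                                            List<double[]> bucket = buckets[(nx * B + ny) * C + nz];
                                            if (bucket == null) continue;
                                            for (double[] p : bucket) {
                                                double ex = p[0] - x * d, ey = p[1] - y * d, ez = p[2] - z * d;
                                                double dist = Math.sqrt(ex * ex + ey * ey + ez * ez);
                                                if (dist < best) best = dist;
                                            }
                                        }
                            result[(x * B + y) * C + z] = (best <= D) ? best : FAR;
                        }
                return result;
            }

            static int clamp(int v, int lo, int hi) { return Math.max(lo, Math.min(hi, v)); }
        }

    The early exit on the cut-off D is what keeps this cheap: each grid point only ever inspects a handful of nearby cells, so the cost grows roughly with the number of grid points rather than with grid points times cloud points.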

  • Azure application working on the emulator but not on the Azure cloud

    - by Hisham Riaz
    First, I am developing my MVC3 application in Visual Web Developer 2010 Express, migrating my MVC3 (cshtml) files from MVC2. It works great on the local system using the emulator, but once I deploy the application to Azure it gives runtime errors, for example:

        The layout page "~/Views/Shared/test_page.cshtml" could not be found at the following path: "~/Views/Shared/test_page.cshtml".
        Source Error:
        Line 8:  //Layout = "~/Views/Shared/upload.cshtml";
        Line 9:  //Layout = "~/Views/Shared/_Layout2.cshtml";
        Line 10: Layout = "~/Views/Shared/test_page.cshtml";
        Line 11: }
        Line 12: else

    The code is as follows. _ViewStart.cshtml file:

        @{
            string AccId = Request.QueryString["AccId"].ToString();
            if (AccId=="0")
            {
                //Layout = "~/Views/Shared/upload.cshtml";
                //Layout = "~/Views/Shared/_Layout2.cshtml";
                Layout = "~/Views/Shared/test_page.cshtml";
            }
            else
            {
                string LayOutPagePath = MVCTest.Models.ComponentClass.GetLayOutPagePath(AccId);
                Layout = LayOutPagePath;
            }
        }

    However, the page exists and works fine on the Azure emulator, but not in the Azure cloud. Code for test_page.cshtml:

        @{
            var result = "1234567890";
            var temp_xml = MVCTest.Models.ComponentClass.GetTemplateAndTheme("1");   //returning xml
            string LayOutPagePath = MVCTest.Models.ComponentClass.GetLayOutPagePath("1");   //returning string
        }
        @RenderBody()
        test_page
        @temp_xml
        @result
        @LayOutPagePath

  • Titanium Appcelerator - After enabling cloud services I get UnicodeDecodeError when compiling for Android

    - by Shahar Zrihen
    I've got an app that has a UTF-8 name (Hebrew). I use the platform/android/AndroidManifest.xml file for this. I've managed to narrow it down to the Ti.cloudpush module: only when I enable this module do I get the error. I used to be able to compile for Android without any issues, but as soon as I enable cloud services I get this error:

        [ERROR] Exception occured while building Android project:
        [ERROR] Traceback (most recent call last):
        [ERROR]   File "/Users/Shahar/Library/Application Support/Titanium/mobilesdk/osx/2.1.0.GA/android/builder.py", line 2218, in <module>
        [ERROR]     s.build_and_run(True, None, key, password, alias, output_dir)
        [ERROR]   File "/Users/Shahar/Library/Application Support/Titanium/mobilesdk/osx/2.1.0.GA/android/builder.py", line 1970, in build_and_run
        [ERROR]     self.manifest_changed = self.generate_android_manifest(compiler)
        [ERROR]   File "/Users/Shahar/Library/Application Support/Titanium/mobilesdk/osx/2.1.0.GA/android/builder.py", line 1195, in generate_android_manifest
        [ERROR]     custom_manifest_contents = fill_manifest(custom_manifest_contents)
        [ERROR]   File "/Users/Shahar/Library/Application Support/Titanium/mobilesdk/osx/2.1.0.GA/android/builder.py", line 1122, in fill_manifest
        [ERROR]     manifest_source = manifest_source.replace(ti_permissions,permissions_required_xml)
        [ERROR] UnicodeDecodeError: 'ascii' codec can't decode byte 0xd7 in position 501: ordinal not in range(128)

    This is my manifest file, with the part that causes the issue. If I remove the Hebrew name, it compiles without any issues:

        <application android:icon="@drawable/appicon" android:label="??????"
            android:name="QuestionnaireApplication" android:debuggable="false" >
            <activity android:name=".QuestionnaireActivity" android:label="??????"
                android:theme="@style/Theme.Titanium" android:screenOrientation="portrait"
                android:configChanges="keyboardHidden" >

    Any suggestions?

  • Making a database backup to SDCard on Android

    - by Pentium10
    I am using the code below to write a backup copy to the SD card and I get java.io.IOException: Parent directory of file is not writable: /sdcard/mydbfile.db

        private class ExportDatabaseFileTask extends AsyncTask<String, Void, Boolean> {
            private final ProgressDialog dialog = new ProgressDialog(ctx);

            // can use UI thread here
            protected void onPreExecute() {
                this.dialog.setMessage("Exporting database...");
                this.dialog.show();
            }

            // automatically done on worker thread (separate from UI thread)
            protected Boolean doInBackground(final String... args) {
                File dbFile = new File(Environment.getDataDirectory() + "/data/com.mypkg/databases/mydbfile.db");
                File exportDir = new File(Environment.getExternalStorageDirectory(), "");
                if (!exportDir.exists()) {
                    exportDir.mkdirs();
                }
                File file = new File(exportDir, dbFile.getName());
                try {
                    file.createNewFile();
                    this.copyFile(dbFile, file);
                    return true;
                } catch (IOException e) {
                    Log.e("mypck", e.getMessage(), e);
                    return false;
                }
            }

            // can use UI thread here
            protected void onPostExecute(final Boolean success) {
                if (this.dialog.isShowing()) {
                    this.dialog.dismiss();
                }
                if (success) {
                    Toast.makeText(ctx, "Export successful!", Toast.LENGTH_SHORT).show();
                } else {
                    Toast.makeText(ctx, "Export failed", Toast.LENGTH_SHORT).show();
                }
            }

            void copyFile(File src, File dst) throws IOException {
                FileChannel inChannel = new FileInputStream(src).getChannel();
                FileChannel outChannel = new FileOutputStream(dst).getChannel();
                try {
                    inChannel.transferTo(0, inChannel.size(), outChannel);
                } finally {
                    if (inChannel != null) inChannel.close();
                    if (outChannel != null) outChannel.close();
                }
            }
        }
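    The exception above concerns the destination directory rather than the copy logic itself; a hedged guess (not confirmed by the question) is that the SD card is not mounted read/write at that moment, or that the manifest lacks the WRITE_EXTERNAL_STORAGE permission. A minimal pre-flight check along those lines, using only standard android.os.Environment calls, could run before doInBackground starts copying:

        // Sketch of a pre-flight check; the two causes probed here are assumptions,
        // not something established by the question.
        private boolean externalStorageWritable() {
            String state = android.os.Environment.getExternalStorageState();
            if (!android.os.Environment.MEDIA_MOUNTED.equals(state)) {
                // Card removed, mounted read-only, shared over USB, etc.
                android.util.Log.e("mypck", "External storage not mounted read/write: " + state);
                return false;
            }
            java.io.File exportDir = android.os.Environment.getExternalStorageDirectory();
            if (!exportDir.canWrite()) {
                // Commonly means the app lacks android.permission.WRITE_EXTERNAL_STORAGE.
                android.util.Log.e("mypck", "Cannot write to " + exportDir.getAbsolutePath());
                return false;
            }
            return true;
        }

    If the second branch is the one that fires, adding <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" /> to AndroidManifest.xml is the usual remedy.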

  • What is the suggested approach to syncing/backing up/restoring from SQL Server 2008 to SQL Server 2005?

    - by Eoin Campbell
    I only have SQL Server 2008 (Dev Edition) on my development machine. I only have SQL Server 2005 available with my hosting company (and I don't have direct connection access to this database). I'm just wondering what the best approach is for getting the initial DB structure and data into production, and for keeping any structural/data changes in sync in future. As far as I can see:

        - Replication - not an option because I can't connect to the production DB.
        - Restoring a backup - not an option because, as far as I can see, you cannot export a DB from 2008 that is restorable in 2005 (even with the 2008 DB set to 2005 compatibility mode), and it wouldn't make sense to restore production over the top of my dev version anyway.
        - Dump all the scripts from my 2008 database, revert my dev machine from 2008 to 2005, recreate the database from the scripts, then just use backup & restore to get the initial DB into production, and run scripts through the web panel from that point onwards.
        - Dump all the scripts from my 2008 database and generate the entire 2005 DB from scripts in production, then run scripts through the web panel from that point onwards.

    With the last two options, I'd probably need to script all the data inserts as well using some tool (which I presume exists on the web). Are there any other possible solutions that I'm not considering?

  • Time Machine for Windows?

    - by Blub
    I finally got convinced to start using some kind of version control for my code instead of zipping up a copy of the project at the end of each day. I downloaded Tortoise SVN and used it to create a repository locally on my hard drive. I've been using it for two days now, but I have to say that using it is actually more hassle than just copying the project manually in Explorer. Sure, you only store incremental changes, but with the cheap disks of today I can't really say that's an argument when you only have small projects. I haven't really found a quick way to browse the older versions of my files either. What I want is an infinite undo that is completely transparent while I code: if I save the file, I want a backup. I don't want to check out and check in, and don't even get me started on moving files. I haven't tried Time Machine for OS X, but it looks like exactly what I'm looking for. Does such a program exist for Windows? Preferably free, and with some kind of tagging system so I can tag a timestamp when the project is working, etc. Maybe I should add that I mostly work alone on a single computer. Update: Some of you asked why I want backups. Since I work alone, it's mostly to allow me to quickly hack up a solution without worrying that something will screw up.

  • Is the RESTORE process dependent on schema?

    - by Martin Aatmaa
    Let's say I have two database instances:

        InstanceA - production server
        InstanceB - test server

    My workflow is to deploy new schema changes to InstanceB first, test them, and then deploy them to InstanceA. So, at any one time, the instance/schema relationship looks like this:

        InstanceA - Schema Version 1.5
        InstanceB - Schema Version 1.6 (new version being tested)

    An additional part of my workflow is to keep the data in InstanceB as fresh as possible. To do this, I take database backups of InstanceA and apply (restore) them to InstanceB. My question is: how does the schema version affect the restore process? I know I can do this:

        Back up InstanceA - Schema Version 1.5
        Restore to InstanceB - Schema Version 1.5

    But can I do this?

        Back up InstanceA - Schema Version 1.5
        Restore to InstanceB - Schema Version 1.6 (new version being tested)

    If not, what would the failure look like? If yes, would the type of schema change matter? For example, if Schema Version 1.6 differed from Schema Version 1.5 only by an altered stored proc, I imagine that this type of schema change shouldn't affect the restore process. On the other hand, if Schema Version 1.6 differed from Schema Version 1.5 by having a different table definition (say, an additional column), I imagine this would affect the restore process. I hope I've made this clear enough. Thanks in advance for any input!

  • Uniquely identify files/folders in NTFS, even after move/rename

    - by Felix Dombek
    I haven't found a backup (synchronization) program that does what I want, so I'm thinking about writing my own. What I have now does the following: it goes through the data in the source and, for every file that has its archive bit set OR does not exist in the destination, copies it to the destination, overwriting a possibly existing file. When done, it checks every file in the destination to see whether it exists in the source, and if it doesn't, deletes it. The problem is that if I move or rename a large folder, it first gets copied to the destination even though it is in principle already there, just under a different path. Then the folder that was already there is deleted afterwards. Apart from the unnecessary copying, I frequently run into space problems because my backup drive isn't large enough to hold the original data twice. Is there a way to programmatically identify such moved/renamed files or folders, i.e. by NTFS ID or physical location on media or something else? Are there solutions to this problem? I do not care about the programming language, but hints for doing this with Python, C++, C#, Java or Prolog are appreciated.
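    On the Java side (one of the languages the question allows), java.nio.file exposes a fileKey() attribute that is intended as exactly this kind of path-independent identity. The caveat is that the API only promises a key where the file system supports it, and on some Windows JDK builds it comes back null, in which case the NTFS file ID (volume serial number plus file index, as returned by the Win32 GetFileInformationByHandle call) would have to be read natively instead. A small hedged sketch:

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.nio.file.attribute.BasicFileAttributes;

        // Sketch: store one identity key per file in the backup index so a later run can
        // recognise a moved/renamed file instead of re-copying it and deleting the old copy.
        public class FileIdentity {

            // Opaque identity of the file, or null when this platform/JDK does not expose one.
            static Object identityOf(Path path) throws IOException {
                BasicFileAttributes attrs = Files.readAttributes(path, BasicFileAttributes.class);
                return attrs.fileKey();  // may be null on Windows; then fall back to other matching
            }

            public static void main(String[] args) throws IOException {
                for (String arg : args) {
                    Path p = Paths.get(arg);
                    System.out.println(p + " -> " + identityOf(p));
                }
            }
        }

    If fileKey() does turn out to be null on the machines involved, a pragmatic fallback is to shortlist candidates by size and modification time and confirm with a content hash before treating a file as moved rather than new.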

  • Azure: The process cannot access the file "" because it is being used by another process.

    - by Shantanu
    Hi all, I am trying to get a MATLAB-compiled exe running on the Azure cloud, and for that purpose I need to get a v78.zip onto the local storage of the cloud and unzip it before I can try to run the exe. The program works fine when executed locally, but on deployment it gives an error at the line marked below in the code. The error is:

        The process cannot access the file 'C:\Resources\directory\cc0a20f5c1314f299ade4973ff1f4cad.WebRole.LocalStorage1\v78.zip' because it is being used by another process.
        Exception Details: System.IO.IOException: The process cannot access the file 'C:\Resources\directory\cc0a20f5c1314f299ade4973ff1f4cad.WebRole.LocalStorage1\v78.zip' because it is being used by another process.

    The code is given below:

        string localPath = RoleEnvironment.GetLocalResource("LocalStorage1").RootPath;
        Response.Write(localPath + " \n");
        Directory.SetCurrentDirectory(localPath);
        CloudBlob mblob = GetProgramContainer().GetBlobReference("v78.zip");
        CloudBlockBlob mbblob = mblob.ToBlockBlob;
        CloudBlob zipblob = GetProgramContainer().GetBlobReference("7z.exe");
        string zipPath = Path.Combine(localPath, "7z.exe");
        string matlabPath = Path.Combine(localPath, "v78.zip");
        IEnumerable<ListBlockItem> blocklist = mbblob.DownloadBlockList();
        BlobStream stream = mbblob.OpenRead();
        FileStream fs = File.Create(matlabPath);    (Exception occurs here)

    It would be a great help if someone could tell me where I'm going wrong. Thanks! Shan

  • Linux & Windows Boot Up Times in Amazon Web Service and Windows Azure

    - by Adron
    I've been working with Windows Azure and Amazon Web Services EC2 for a good many months now (almost getting to the years range) and I've seen something over and over that seems troubling. With AWS & Linux I commonly get instance startup times with EC2 around the 1-3 minute range. With AWS & Windows OS on an EC2 instance it often takes 10-20 minutes. With Windows Azure Web or Service Role I often get anywhere from 6-30 minutes waiting for a role to startup. I assume of course this involves booting up a windows instance somewhere in the fabric. I know there has always been tons of FUD about windows vs. Linux, but I'd really like to know why it is that Windows 08 or 03 boots so much slower in the cloud than Linux. Any specific technical information regarding this would be greatly appreciated! Thanks.

  • MySQL is killing the server IO.

    - by OneOfOne
    I manage a fairly large/busy vBulletin forum (~2-3k requests per second, running on the Gigenet cloud). The database is ~10 GB (~9 million posts, ~60 queries per second). Lately MySQL has been grinding the disk like there's no tomorrow according to iotop, and slowing the site. The last idea I can think of is using replication, but I'm not sure how much that would help, and I'm worried about database sync. I'm out of ideas; any tips on how to improve the situation would be highly appreciated.

    Specs: Debian Lenny 64bit, ~12Ghz (6 cores) CPU, 7520gb RAM, 160gb disk. Kernel: 2.6.32-4-amd64. mysqld Ver 5.1.54-0.dotdeb.0 for debian-linux-gnu on x86_64 ((Debian)).

    Other software:

        vBulletin 3.8.4
        memcached 1.2.2
        PHP 5.3.5-0.dotdeb.0 (fpm-fcgi) (built: Jan 7 2011 00:07:27)
        lighttpd/1.4.28 (ssl) - a light and fast webserver

    PHP and vBulletin are configured to use memcached.

    MySQL settings:

        [mysqld]
        key_buffer = 128M
        max_allowed_packet = 16M
        thread_cache_size = 8
        myisam-recover = BACKUP
        max_connections = 1024
        query_cache_limit = 2M
        query_cache_size = 128M
        expire_logs_days = 10
        max_binlog_size = 100M
        key_buffer_size = 128M
        join_buffer_size = 8M
        tmp_table_size = 16M
        max_heap_table_size = 16M
        table_cache = 96

  • How should I host a site that could potentially get a short spike in traffic of 1000%+

    - by James Simpson
    This is a purely theoretical question, but what if I had a site that would normally only get a couple thousand hits a day, but for a few days each month that could shoot to several hundred thousand or even several million hits over the period of 1-3 days. The site would be pretty bare-bones (as in, 2-3 total pages with 1-2 max MySQL queries on each page and some PHP), so bandwidth wouldn't be the issue, but sheer volume taking down the site would be the main concern. Cloud hosting seems like the best way to go, but would something like Amazon EC2, MediaTemple, or something else be the right choice in this case?

  • UEC - Can the Cluster Controller and Storage Controller be separate systems?

    - by Jeremy Hajek
    My department is implementing an Ubuntu Enterprise Cloud. I have done the testing and am quite comfortable with the 4 pieces: CC/SC, CLC, WS, NC. Looking at the various documents below, it appears that the Storage Controller and Cluster Controller (eucalyptus-sc and eucalyptus-cc) are always installed on the same system. My question is this: can I install the storage controller and the cluster controller on separate systems?

        http://open.eucalyptus.com/wiki/EucalyptusAdvanced_v2.0 - the picture indicates that the CC and SC are two different machines
        http://www.canonical.com/sites/default/files/active/Whitepaper-UbuntuEnterpriseCloudArchitecture-v1.pdf - p. 10, first paragraph, uses the word "machine(s)"
        http://software.intel.com/file/31966 - p. 8 indicates the same separate architecture

    BUT... https://help.ubuntu.com/community/UEC/PackageInstallSeparate indicates that the SC and CC are to be on the same system.

  • Automatic time tracking with central server, web reports

    - by user124209
    I need software for automatic time tracking on Windows, with the following features:

        - It should record the time spent using the computer each day: start time and end time.
        - It should record what programs the employee used and the total time for each program over a specified period of time.
        - It must have a centralized server that collects and stores all data. It could be a cloud server outside of the company network.
        - It must have a web interface for viewing the monthly reports (the last but the most important requirement!).

    A nice feature to have would be automatic generation of timesheets and Mac OS X support. I am looking to use it for a small team; this is not for personal use. Does anybody know of software with these features?

  • How can I access my music anywhere?

    - by musicfreak
    So here's the idea: I use multiple computers on an almost daily basis. I would like to be able to access my entire music library from any of those computers through the Internet. Is there a service or perhaps some software that would allow me to host my music "in the cloud", or in some other way access it from a different computer? I've searched for something like this and the closest I've found is the Media Player application found in Opera Unite, but that requires my home computer to be turned on at all times, which is less than ideal. I'm willing to pay, but not so much that I could just rent a private server for a lower price. Thanks in advance.

  • What are the challenges when my enterprise desires to move the processing component of an application to the cloud?

    - by Berkay
    Assume that I have an enterprise accounting application that consists of a front-end interface, a processing tier, and a back-end database. This is an application that contains private business data, and thus is traditionally run in a secure private network environment within the enterprise. What are the challenges that appear when my enterprise desires to move the processing component of this application to a cloud computing data center in order to achieve greater scalability or to reduce IT costs? Please note: do I have to make significant changes to my own infrastructure to enable external access to formerly private resources? Do I have to modify the application code to handle the new network topology? Thanks - if you give your answers in a simple manner, it will be really appreciated.

  • Problem with Xen, xvda and sda

    - by Javier J. Salmeron Garcia
    I am creating a cloud for my university using Eucalyptus with Xen (the PCs have Debian Squeeze 64-bit installed). I have a problem with the following guest configuration:

        #
        # Configuration file for the Xen instance evenmorefinalfoo, created
        # by xen-tools 4.2 on Thu May 26 11:03:06 2011.
        #

        #
        # Kernel + memory size
        #
        kernel  = '/boot/vmlinuz-2.6.32-5-xen-amd64'
        ramdisk = '/boot/initrd.img-2.6.32-5-xen-amd64'
        vcpus   = '1'
        memory  = '128'

        #
        # Disk device(s).
        #
        root = '/dev/sda2 ro'
        disk = [
            'file:/home/xen/domains/evenmorefinalfoo/disk.img,sda2,w',
            'file:/home/xen/domains/evenmorefinalfoo/swap.img,sda1,w',
        ]

    As you can see, the disk and swap images are meant to be mounted as sda1 and sda2. However, when I start the guest, they show up as xvda1 and xvda2, causing an error. Is there anything that I can do about that? It seems like a Xen error. Thank you in advance,
