Search Results

Search found 20904 results on 837 pages for 'disk performance'.

Page 496/837

  • Ubuntu 12.04 mdadm inactive

    - by user32274
    For a while now, my RAID 5 has ceased to work. Every time I try "mdadm --detail /dev/md127", it lists all the drives and drive info, but shows that two of the drives have been removed. After some restarts, doing the same thing, I am getting "/dev/md127 does not appear to be active". When I go into Disk Utility, I can see all 6 hard drives healthy and present, and I can see the RAID 5 at the bottom under Multi-disk Devices. However, the RAID shows 0.0 KB and is not active. Please help and let me know how to proceed from here. I would really like to avoid rebuilding the RAID, especially because all 6 drives seem to be healthy and present. Thanks so much.
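
    A minimal diagnostic sketch, assuming the six member disks are /dev/sd[b-g] (adjust to your actual device names): --examine reads the RAID superblock on each disk, and --assemble tries to restart the existing array without rebuilding it. Only consider adding --force after reviewing the --examine output, since forcing assembly with a stale member can lose data.

      # What the kernel currently thinks of the array
      cat /proc/mdstat
      # Inspect the RAID superblock on each member disk
      sudo mdadm --examine /dev/sd[b-g]
      # Stop the inactive array, then reassemble it from its superblocks
      sudo mdadm --stop /dev/md127
      sudo mdadm --assemble --scan --verbose
      # Confirm the result
      sudo mdadm --detail /dev/md127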

    Read the article

  • openSUSE 12.2 reaches its final release: more stable and faster with Linux 3.4, KDE 4.8, LibreOffice 3.5

    openSUSE 12.2 available: more stable and faster with Linux 3.4, KDE 4.8, LibreOffice 3.5 and several other open source programs. The stable release of the 12.2 update to the openSUSE Linux distribution is available for download. Originally expected on July 11, openSUSE 12.2 arrives a few months late, which lets it boast about its stability and its performance gains. [IMG]http://rdonfack.developpez.com/images/opensuse.jpg[/IMG] Proudly backed by the Linux 3.4 kernel, which gives it a faster storage layer that prevents stalls during transfers, openSUSE ships the glibc 2.15 base library, ...

    Read the article

  • Greenfoot project is read-only

    - by AzharHafiz.com
    I received this message when starting Greenfoot on Ubuntu 11.10. I'm a newbie and not sure where the (greenfoot) file is located. The project is read-only. How and where do I change the permission? You will not be able to create objects or execute methods. Either the access rights of the project directory are set as 'read only' for you, or the whole file system is not writable (is it a CD?). To fully use this project, you must ensure that it is on a writable file system (usually your hard disk), and that you have write permission in the project directory and each file within it. This can often be accomplished by choosing "save as" from the Project menu after closing this dialog.
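
    A minimal sketch of the usual fix, assuming the project lives somewhere under your home directory; ~/greenfoot/MyProject is a hypothetical path, so substitute your real project folder:

      # Take ownership of the project directory and make sure you can write to every file in it
      sudo chown -R "$USER": ~/greenfoot/MyProject   # hypothetical path
      chmod -R u+w ~/greenfoot/MyProject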

    Read the article

  • Oracle E-Business Financials Recommended Patch Collections (RPCs) for R12.1.3 Have Been Released for August 2012

    - by Oracle_EBS
    What is a Recommended Patch Collection (RPC)? An RPC is a collection of recommended patches consolidated into a single, downloadable patch, ready to be applied. The RPCs are created with the following goals in mind: Stability: Address issues that occur often and interfere with the normal completion of crucial business processes, such as period close, as observed by Oracle Development and Global Customer Support. Root Cause Fixes: Deliver root cause fixes for data corruption issues that delay period close, normal transaction flow actions, performance, and other issues. Compact: While bundling a large number of important corrections, we have kept the file footprint as small as possible to facilitate uptake and minimize testing. Reliable: Reliable code with multiple customer downloads and comprehensive testing by QA, Support and Proactive Support. RPCs are available for the following products: Cash Management, Collections, E-Business Tax, Financials for India, Fixed Assets, General Ledger, Internet Expenses, iReceivables, Loans, Payables, Payments, Receivables, and Subledger Accounting. For the latest Financials Recommended Patch Collections (RPCs), please view: EBS: R12.1 Oracle Financials Recommended Patches [Doc ID 954704.1].

    Read the article

  • Saving Ubuntu settings to load into a new machine

    - by CodeKingPlusPlus
    I will soon be receiving a new machine (yes, I need the improved performance!). How can I transfer all of my current Ubuntu settings and files to the new machine? This is very important because I have many packages installed from apt-get, Python packages, Emacs packages and so forth. I really want to avoid installing all of the same packages over again. I am currently running Ubuntu 12.04 LTS. What is the best solution for transferring my current Ubuntu settings and files to a new machine? Let me know if you need more information. All help is greatly appreciated!
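
    One common approach, sketched here under the assumption that both machines run the same Ubuntu release; user@newmachine and its path are placeholders. Replaying the dpkg package selections restores the apt-installed packages, and copying the home directory carries dotfiles, ~/.emacs.d and similar settings:

      # On the old machine: record installed packages and copy your home directory
      dpkg --get-selections > ~/packages.list
      pip freeze > ~/python-packages.txt          # if you installed Python packages with pip
      rsync -a ~/ user@newmachine:/home/user/     # placeholder host and path

      # On the new machine: replay the package selections, then reinstall pip packages
      sudo dpkg --set-selections < ~/packages.list
      sudo apt-get dselect-upgrade
      pip install -r ~/python-packages.txt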

    Read the article

  • Can't open any of my drives/devices (not USB drive)

    - by Anontu
    I can't open any of my computer drives (C:/, D:/, ...). Every time I try to open a drive, the following notice appears (I'm new on Ask Ubuntu, so I can't upload a screenshot; I need 10 reputation to do that): Unable to access "__ Volume". Error mounting /dev/sda7 at /media/MyPC/__Volume: Command-line 'mount -t "ntfs" -o "uhelper=udisks2,nodev,nosuid,uid=1000,gid=1000,dmask=0077,fmask=0177" "/dev/sda7" "/media/MyPC/__Volume"' exited with non-zero exit status 14: The disk contains an unclean file system (0, 0). Metadata kept in Windows cache, refused to mount. Failed to mount '/dev/sda7': Operation not permitted. The NTFS partition is in an unsafe state. Please resume and shutdown Windows fully (no hibernation or fast restarting), or mount the volume read-only with the 'ro' mount option. I have Windows side by side with Ubuntu 13.04 on my computer and I have done the things the notice suggests (like shutting down Windows properly), but it's not working.
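
    Two hedged options, assuming /dev/sda7 really is the NTFS data partition: mount it read-only just to reach the files, or clear the "unclean" flag left behind by Windows hibernation/fast startup with ntfsfix (part of the ntfs-3g package). Only run ntfsfix if you are sure Windows was shut down completely, otherwise you risk losing metadata Windows has cached but not written out yet.

      # Read-only mount (safe): lets you browse and copy files off the partition
      sudo mkdir -p /mnt/windows
      sudo mount -t ntfs-3g -o ro /dev/sda7 /mnt/windows

      # Clear the dirty flag so the partition can be mounted read-write again
      sudo ntfsfix /dev/sda7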

    Read the article

  • What do I need to do when installing an old version of Ubuntu on a new hardware platform?

    - by Blangero
    I got a board with an Aptio CPU, which is said to be a new platform that may have driver problems with older versions of Ubuntu, and it does. After installing my customized live Ubuntu server with X (the system runs like a live CD), X starts but fails to get the right resolution. If I run ubiquity in the live system to install it to my disk, I can't start X and the whole console is a mess. I tried connecting over SSH and upgrading, but that failed to solve the problem. With Ubuntu 12.04.2 Desktop installed, the text at boot is a mess; X starts but still cannot get the right resolution. With Ubuntu 14.04 Server installed, the text at boot is also a mess, but it turns clear and the resolution is right. So what else do I need to provide? CPU info? lspci -nn | grep VGA? Anything else? And what's the problem? What can I do to support the newest platform in my customized system? Thanks
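
    A sketch of the information that usually helps pin down graphics problems like this, collected on the affected system (all standard tools; lshw may need to be installed with apt-get first):

      lspci -nn | grep -i vga                       # graphics adapter and its PCI IDs
      grep -m1 'model name' /proc/cpuinfo           # CPU model
      sudo lshw -C display                          # which kernel driver is bound to the GPU
      grep -E '\(EE\)|\(WW\)' /var/log/Xorg.0.log   # X server errors and warnings
      dmesg | grep -iE 'drm|fb'                     # kernel mode-setting messages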

    Read the article

  • Alternative ways to construct maps

    - by sideways8
    I've searched around and it seems like most people are using tile-based map systems. I suppose this question is more theoretical than practical (I am not very concerned about memory or performance speed), but I want to know: what other ways can a map be created in a game? A map being a graphic representation of terrain that can be navigated, has entrances and exits, and boundaries (no-go zones). Besides using text files to store and arrays to load tile data, one idea I had was to store a map entirely as a graphic file and query the pixel colour to determine boundaries (i.e., you can only move in a certain direction if the way is bright enough in that direction). What other creative map systems are out there?

    Read the article

  • Can't find my DVD drive in Ubuntu 12.04

    - by user72953
    I am completely new to Linux; due to some problems with my Windows I decided to give Linux a try. I am quite liking it except for some minor issues. The biggest of them is that I cannot find my DVD drive in Ubuntu. After I insert a disk, the drive's green light flickers and then nothing. I don't get anything on my Unity launcher or desktop. There is no entry inside my /dev directory that indicates a DVD drive. There is a cdrom folder in my root folder but nothing is in it. If you guys know what's wrong, I'd appreciate the help. I am a complete newbie so don't expect me to know anything about Ubuntu, although I have spent lots of hours in the last 2 days googling my issue so I have basic information...
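
    A few hedged checks to see whether the kernel detects the drive at all; /mnt/dvd below is just a scratch mount point used for the test:

      dmesg | grep -iE 'cdrom|cd-rom|dvd|sr0'    # did the kernel register an optical drive?
      ls -l /dev/sr* /dev/cdrom* 2>/dev/null     # device nodes, if any
      cat /proc/sys/dev/cdrom/info               # capabilities the kernel reports for the drive
      # If /dev/sr0 exists, try mounting a data disc manually
      sudo mkdir -p /mnt/dvd && sudo mount /dev/sr0 /mnt/dvd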

    Read the article

  • (12.04 vm/server) Dist-upgrade to 3.2.0-63 wants to remove git (1.9.2) and git-core - is that the correct behavior?

    - by YellowShark
    I was wondering if anyone knows why dist-upgrade wants to remove git. FWIW, this is a pretty simple box, mainly used for web dev. $ uname -a Linux precise64 3.2.0-61-generic #93-Ubuntu SMP Fri May 2 21:31:50 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux $ git --version git version 1.9.2 $ sudo apt-get dist-upgrade Reading package lists... Done Building dependency tree Reading state information... Done Calculating upgrade... Done The following packages will be REMOVED: git git-core The following NEW packages will be installed: linux-headers-3.2.0-63 linux-headers-3.2.0-63-generic linux-image-3.2.0-63-generic The following packages will be upgraded: git-man linux-headers-server linux-image-server linux-server phpmyadmin 5 upgraded, 3 newly installed, 2 to remove and 0 not upgraded. Need to get 58.8 MB of archives. After this operation, 199 MB of additional disk space will be used. Do you want to continue [Y/n]?
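
    Git 1.9.2 is newer than what 12.04 ships, so it presumably comes from a PPA; a likely explanation (an assumption, not confirmed by the output above) is that the resolver prefers the archive's git-man upgrade over the PPA packaging and drops git/git-core to satisfy it. A few commands to investigate and work around it:

      apt-cache policy git git-core git-man    # where each package comes from and which version wins
      apt-get -s dist-upgrade                  # -s simulates the upgrade without changing anything

      # Keep the PPA git in place by holding it, so dist-upgrade leaves it alone
      sudo apt-mark hold git

      # Or accept the removal and reinstall git afterwards
      sudo apt-get dist-upgrade && sudo apt-get install git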

    Read the article

  • Getting Started with 12.04 LTS Problems

    - by mark
    Okay, previously I used Ubuntu 10.04 on desktops and loved it! I bought a newer Toshiba Satellite with an i7 CPU, 8 GB RAM and a 1 TB HD. The first thing I tried to do was install Ubuntu 9 on it; then I found out it was no longer supported (I gave my 10.04 disk away). So I tried installing Windows 7 and experienced SO MANY problems; I am going back to try 12.04, starting all over again. After installing 12.04, wireless was detected yet said it was disabled by a hardware switch. I went and read through the message boards and found a solution, which was a sudo rfkill command. Okay, all excited that I would finally get my Ubuntu to work, I rushed back to my laptop from the internet cafe and went looking for the terminal on my Ubuntu 12.04, and I cannot find it. If I can't find the terminal, how can I enter any sudo stuff? So I guess the first question is: where is the terminal located in 12.04? Thank you.
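
    On a default 12.04 desktop the terminal is not on the launcher: press Ctrl+Alt+T, or open the Dash (Super key) and type "Terminal". From there, a hedged sketch of the rfkill fix usually suggested for the "disabled by hardware switch" message (assuming the block is a soft block and not the physical switch itself):

      rfkill list               # shows which radios are soft- or hard-blocked
      sudo rfkill unblock all   # clears any soft block on the wireless radio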

    Read the article

  • What is the best way to work with large databases in Java depending on context?

    - by user19000
    We are trying to figure out the best practice for working with very large DBs in Java. What we do is a kind of BI (business intelligence), i.e., analyzing very large DBs and using them to create intermediate DBs that represent intelligent knowledge of the DBs. We are currently using JDBC and just performing queries using a ResultSet. As more and more data is being created, we are wondering whether more appropriate ways exist for parsing and manipulating these large DBs: We need to support 'chunk' manipulation and not an entire DB at once (e.g., LIMIT in JDBC; very poor performance). We do not need to be constantly connected since we are just pulling results and creating new tables of our own. We want to understand JDBC alternatives, with respect to advantages and disadvantages. Whether you think JDBC is the way to go or not, what are the best practices to go by depending on context (e.g., for large DBs queried in chunks)?

    Read the article

  • What would you choose for your project between .NET and Java at this point in time?

    - by Basic
    You are just starting a new project and you have these two technologies to choose from, Java and .NET. The project you are working on doesn't have features that would make it easy to choose between the two technologies (e.g. ".NET has this that I need and Java does not"), and both of them should work just fine for you (though you only need one, of course). Take into account: performance; tools available (even 3rd-party tools); cross-platform compatibility; libraries (especially 3rd-party libraries); cost (Oracle seems to try to monetize Java); development process (easiest/fastest). Also keep in mind that Linux is not your main platform but you would like to port your project to Linux/Mac OS as well. You should definitely keep in mind the trouble that has been revolving around Oracle and the Java community, and the limitations of Mono and Java as well. It would be much appreciated if people with experience in both can give an overview and their own subjective view of which they would choose and why.

    Read the article

  • Tweaking Hudson memory usage

    - by rovarghe
    Hudson 3.1 has some performance optimizations that greatly reduce its memory footprint. Prior to this, Hudson used to always hold the entire data model (all jobs and all builds) in memory, which affected scalability. Some installations configured heap sizes in excess of 1 GB to counteract this. Hudson 3.1.x maintains an MRU cache and only loads jobs and builds as they are required. Because of the inability to change existing APIs and be backward compatible with plugins, there were limits to how far we could go with this approach.

    Memory optimizations almost always come with a related cost; in this case it is the additional I/O that has to be performed to load data on request. On a small site that has frequent traffic, this is usually not noticeable since the MRU cache will usually hold on to all the data. A large site with infrequent traffic might experience some delays when the first request hits the server after a long gap. If you have a large heap and are able to allocate more memory, the cache settings can be adjusted to take advantage of this and even go back to pre-3.1 behavior. All the cache settings can be passed as options to the JVM container (Tomcat or the default Jetty container) using the -D option. There are two caches, independent of each other, one for Jobs and the other for Builds.

    For the jobs cache:

    hudson.jobs.cache.evict_in_seconds (default=60): Seconds from last access (could be because of a servlet request or a background cron thread) after which a job should be purged from the cache. Set this to 0 to never purge based on time.

    hudson.jobs.cache.initial_capacity (default=1024): Initial number of jobs the cache can accommodate. Setting this to the number of jobs you typically display on your Hudson landing page or home page will speed up consecutive access to that page. If the default is too large you may consider downsizing and using that memory for the Builds cache instead.

    hudson.jobs.cache.max_entries (default=1024): Maximum number of jobs in the cache. The default is large enough for most installations, but if you find I/O activity whenever you access the Hudson home page you might consider increasing this. First verify whether the I/O is caused by frequent eviction (see above) rather than by the cache not being large enough.

    For the builds cache: The builds cache is used to store Build objects as they are read from storage. Typically this happens when a user drills down into the details of a particular Job from the Hudson home page. The cache is shared among builds for different jobs, since in most installations all jobs are not accessed with the same frequency, so a per-job builds cache would be a waste of memory.

    hudson.job.builds.cache.evict_in_seconds (default=60): Same as the equivalent Job cache setting, applied to Builds.

    hudson.job.builds.cache.initial_capacity (default=512): Same as the equivalent Job cache setting. Note the smaller initial size. If your site stores a large number of builds and has frequent access to more builds, you might consider bumping this up.

    hudson.job.builds.cache.max_entries (default=10240): The default maximum is large enough for most installations; the builds cache holds bigger objects, so be careful about increasing the upper limit on this. See the section on monitoring below.
    Sample usage: java -jar hudson-war-3.1.2-SNAPSHOT.war -Dhudson.jobs.cache.evict_in_seconds=300 -Dhudson.job.builds.cache.evict_in_seconds=300

    Monitoring cache usage: The 'jmap' tool that comes with the JDK can be used to monitor cache performance in an indirect way by looking at the number of Job and Build objects in each cache. Find the PID of the Hudson instance and run:

      $ jmap -histo:live <pid> | grep 'hudson.model.*Lazy.*Key$'

    Here's a sample output:

      num    #instances   #bytes   class name
      523:           28      896   hudson.model.RunMap$LazyRunValue$Key
      1200:           3       96   hudson.model.LazyTopLevelItem$Key

    These are the keys to the Jobs (LazyTopLevelItem$Key) and Builds (RunMap$LazyRunValue$Key) in the caches, so counting the number of keys is a good indicator of the number of items in the cache at any given moment. The sizes in bytes can be ignored; they are just the sizes of the keys, not the actual sizes of the objects they hold. Those sizes can only be obtained with a profiler. With the output above we can conclude that there are 3 jobs and 28 builds in memory. The 28 builds can all be from 1 job or from all 3 jobs. Over time on an idle system these should get evicted and the cache should be empty. In practice, because of background cron threads and triggers, jobs rarely fall to zero. Access of a job or a build by a cron thread resets the eviction timer.
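
    If Hudson runs inside Tomcat rather than the bundled Jetty, the same -D flags can be passed through CATALINA_OPTS, and the jmap check above can be repeated periodically to watch cache growth and eviction. A sketch, where the pgrep pattern is an assumption about how the process was started and should be adjusted to your deployment:

      # Tomcat: export the cache settings before starting the container
      export CATALINA_OPTS="$CATALINA_OPTS -Dhudson.jobs.cache.evict_in_seconds=300 -Dhudson.job.builds.cache.evict_in_seconds=300"

      # Re-run the histogram every minute to watch the number of cached Jobs and Builds
      PID=$(pgrep -f hudson-war | head -1)
      watch -n 60 "jmap -histo:live $PID | grep 'hudson.model.*Lazy.*Key$'"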

    Read the article

  • PCI function number for SATA AHCI controller

    - by Look Alterno
    I'm debugging a second-stage boot loader for a PC with a SATA AHCI controller. I'm able to enumerate the PCI bus and find the hard disk. So far, so good. Now, lspci on my notebook (Dell Inspiron 1525) shows me: -[0000:00]-+-1f.0 Intel Corporation 82801HEM (ICH8M) LPC Interface Controller +-1f.1 Intel Corporation 82801HBM/HEM (ICH8M/ICH8M-E) IDE Controller +-1f.2 Intel Corporation 82801HBM/HEM (ICH8M/ICH8M-E) SATA AHCI Controller \-1f.3 Intel Corporation 82801H (ICH8 Family) SMBus Controller My question: is the SATA AHCI controller always function 2 in any PC? If not, how do I find it? I don't need to be fully general; booting my notebook will be good enough, without compromising further refinements.
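
    Hedged note: the function number is not fixed across machines; the robust approach is to scan PCI configuration space for class code 0x01 (mass storage), subclass 0x06 (SATA), programming interface 0x01 (AHCI), which live at offsets 0x0B/0x0A/0x09 of each function's config header. On a running Linux system the same identification can be cross-checked with lspci:

      lspci -n | grep ' 0106'       # numeric view: class code 0106 marks SATA controllers
      lspci -nn | grep -i sata      # human-readable cross-check
      lspci -v -s 00:1f.2           # details for the slot found above (bus:device.function)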

    Read the article

  • Ubuntu 12.04 LTS Purple Screen Error. FIX

    - by user100918
    When I boot up my Ubuntu, it hangs on the purple screen. All I can do is press Shift at startup, and it gives me these 3 options: Normal Boot, Perform Disk Scan, Restore Factory Settings. I can also press either E or C: C for the GRUB command line, E for the GNU GRUB editor, which shows these lines: setparams 'Restore Factory Settings' set gfxpayload=text insmod part_msdos insmod ext2 set root='(hd0,msdos1)' linux /boot/vmlinuz-3.3.8-24-generic root=LABEL=SYSTEM ro acpi_osi=Linux acpi_backlight=vendor quiet aufs=restore initrd /boot/initrd.img-3.3.8-24-aufs What is the problem and how do

    Read the article

  • New marketing kits for hardware

    - by A&C Redaktion
    The Oracle marketing kits are a popular tool for sales support. Continuously expanded, they contain draft copy for an emailing, a landing page and a telemarketing script. Brand-new kits are now available, in German among other languages, for the following hardware solutions: Server & Storage: Improve Database Capacity Management with Oracle Storage and Hybrid Columnar Compression Server & Storage: Accelerating Database Test & Development with Sun ZFS Storage Appliance Server & Storage: Upgrade SAN Storage to Oracle Pillar Axiom Server & Storage: SPARC Refresh with Oracle Solaris Operating System Server & Storage: SPARC Server Refresh: The Next Level of Datacenter Performance with Oracle's New SPARC Servers Server & Storage: Oracle Server Virtualization Server & Storage: Oracle Desktop Virtualization

    Read the article

  • Database Insider - October 2012 issue

    - by Javier Puerta
    The October issue of the Database Insider newsletter is now available. (Full newsletter here)
    NEWS: Newly Launched Oracle Exadata X3 Redefines Extreme Performance. At Oracle OpenWorld 2012, Oracle announced the general availability of Oracle Exadata Database Machine X3, a complete package of servers, storage, networking, and software that is massively scalable, secure, and fully redundant, and ideally suited for the varied and unpredictable workloads of cloud computing. Read More
    WEBCASTS: What Are Oracle Users Doing to Improve Availability and Disaster Recovery? The Independent Oracle Users Group (IOUG) surveyed more than 350 data managers and professionals regarding planned and unplanned downtime, database high availability, and disaster recovery solutions. Download the report and watch the Webcast today.

    Read the article

  • cVidya’s MoneyMap Achieves Oracle Exadata Optimized Status

    - by Javier Puerta
    cVidya's MoneyMap running on Oracle Exadata provides extreme performance, including a 4x-16x improvement in high data load rates, 4x faster data transformation and reconciliation, and query speeds (on a 2.5 billion record index) improved from hours to a few seconds! The MoneyMap solution enables operators to reconcile information from all network, operations and business support systems, and through an ongoing automated process it detects problem areas which impact profitability as a result of revenue leakage, data inconsistencies or resources that are not being used efficiently. Once detected, MoneyMap provides tools to promptly correct and manage the problems to achieve profit maximization. Learn more here.

    Read the article

  • How to determine VPS hosting resource needs for my upcoming WordPress blog? How many resources should I purchase? [closed]

    - by Ishwar dixit
    Possible Duplicate: How to find web hosting that meets my requirements? I have decided to purchase VPS hosting but I'm getting confused about the amount of resources I need. WordPress will be used as the platform. The blog I want to set up is assumed to have traffic between 20k and 25k visits per day at a rate of 5 pageviews per visit... there is no download facility provided... the content of the blog will be text, images and videos (which will be used rarely)... The main question is, for the above requirement: How much RAM will be enough? How much CPU usage will I need? How much bandwidth will be enough? How much disk space? Any other requirement? Thanks in advance.
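
    A back-of-envelope sketch of the traffic math, under the stated assumptions plus an assumed average page weight of 1.5 MB per view (adjust to your real theme and image sizes); note that a WordPress caching plugin and a CDN can cut the bandwidth and CPU needs substantially:

      visits=25000; pv_per_visit=5; mb_per_page=1.5
      echo "$visits $pv_per_visit $mb_per_page" | awk '{
          pv_day = $1 * $2                                   # ~125,000 page views/day
          printf "page views/day: %d\n", pv_day
          printf "avg page views/sec: %.1f (plan for 4-5x at peak)\n", pv_day / 86400
          printf "bandwidth/month: ~%.0f GB before caching/CDN\n", pv_day * $3 * 30 / 1024
      }'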

    Read the article

  • Windows 7 and Ubuntu Boot issue

    - by user115137
    I had the idea to dual boot Windows 7 and Ubuntu, and what I did was the following: I made a clean install of Windows 7 using all of my hard drive, then I used the Ubuntu live CD and GParted to partition my drive as follows: /dev/sda1 ext4 20GB (Linux root) /dev/sda2 ntfs 100GB (Win7) /dev/sda3 ext4 350GB (Home) /dev/sda4 extended 4GB (swap) The thing is, when installing Ubuntu I deleted the partition Windows 7 creates for its boot sector and recovery, then resized the drive to look like what I mentioned, and Ubuntu installed GRUB to the MBR. When GRUB boots I can see Ubuntu but not Windows; how can I chainload it? Or should I fix the Windows MBR with the Windows 7 installation disk and try to set up the dual boot from there? I don't really care which of the 2 bootloaders I end up using, I just want the dual boot to work. Thanks
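
    A hedged first step from the Ubuntu side: let os-prober look for the Windows boot sector and regenerate the GRUB menu. This only works if the Windows boot files still exist on /dev/sda2; if they were on the deleted recovery/boot partition, Windows will first need its boot files repaired from the installation disk (Startup Repair, or bootrec /rebuildbcd from the recovery console) before GRUB can chainload it.

      sudo apt-get install os-prober   # usually already installed
      sudo os-prober                   # should list the Windows installation if it is bootable
      sudo update-grub                 # adds a "Windows 7" chainload entry to the GRUB menu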

    Read the article

  • Install 64-bit as dual boot with 32-bit

    - by profgeorgefhart
    I downloaded the Ubuntu 12.10 64-bit .iso onto a bootable USB and want to put it on to my 32-bit 12.10 system as a dual boot, using a totally separate 1 TB disk (which already has data on it). However, when I go to restart and hit F8 for BIOS setup, I can change all the settings except the boot search order. Is there some other way to force the startup to read the USB first? Is the fact that I cannot change the search order to do with the fact that the 32-bit OS was 'installed in legacy mode', not EFI? How can I rectify this? (This might be related: might I need to update my BIOS for my Intel DX38BT motherboard? I think their last upgrade was 2009.)
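
    Two quick checks from the existing 32-bit system, sketched under the assumption that the board is an ordinary legacy-BIOS setup. Many BIOSes also offer a one-time boot menu on a separate key (often F10 or F12, distinct from the F8 setup screen), which avoids changing the permanent boot order at all:

      # If this directory exists, the running system was booted via EFI; otherwise it is legacy BIOS
      [ -d /sys/firmware/efi ] && echo "EFI boot" || echo "legacy BIOS boot"
      # Show each disk and whether it carries a GPT or an MBR (msdos) partition table
      sudo parted -l | grep -E 'Disk /dev|Partition Table'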

    Read the article

  • Alt text vs CSS sprites (SEO vs speed)

    - by leeoniya
    I'm reworking our site to reduce HTTP requests and blocking requests by concatenating JS and CSS, gzipping, loading all JS via LABjs, and using CSS sprites for images that were previously loaded individually via <img> tags. Progress has been great so far: a 5x page-load performance improvement. However, we're in the top 5 organic search rankings in Google for many targeted keywords and phrases. I'm afraid that eliminating so many img tags with alt attributes could hurt our SEO. Does anyone have experience with alt attribute manipulation/removal and its effect on SEO positions? Is a previous rank "sticky"?

    Read the article

  • Intersection of player and mesh

    - by Will
    I have a 3D scene, and a player that can move about in it. In a time-step the player can move from point A to point B. The player should follow the terrain height, but slow down going up cliffs and then fall back, or stop when jumping into a wall, and so on. In my first prototype I determine the Y at the player's centre's X,Z by intersecting a ray with every triangle in the scene. I am not checking their path, but rather just sampling their end-point for each tick. Despite this being JavaScript, it works acceptably performance-wise. However, because I am modelling the player as a single point, the player can position themselves so that they are half inside a cliff face, and so on. I need to model them as a solid, e.g. some cluster of spheres or even their full mesh. I am also concerned that if they were moving faster they might miss the test altogether. How should I solve this?

    Read the article

  • Cloud proxying service

    - by ChristopherJ
    I have an app that mashes up images from Bing image search; it's hosted on Heroku and written in Rails. The app is client-side in JavaScript, so the mashup is done on an HTML5 canvas. This means, though, that if I fetch the images directly from the Bing server, the canvas gets dirty (tainted by the cross-origin images) and I can't save it. As a quick workaround, I have set up a route on my Rails app that simply proxies the request to Bing and passes the result back through. Obviously this is a poorly performing solution and will eat up my dynos very quickly. Can anyone suggest a more suitable option? At the moment I'm thinking maybe Amazon EC2 with Apache mod_rewrite rules would perform better and be more cost-effective. Is there a cloud service (or an app I could deploy to a cloud service) that would be more appropriate for proxying requests for me, so that my JavaScript can fetch the images without dirtying the canvas?

    Read the article
