Search Results

Search found 8367 results on 335 pages for 'temporal difference'.

Page 177 of 335

  • Possible to write an implementation of RAIDZ or RAIDZ2 for the MD driver in the Linux kernel?

    - by Pharaun
    I am curious whether it is possible to implement RAIDZ and/or RAIDZ2 in the MD driver in the Linux kernel. From my understanding, RAIDZ is equivalent to RAID 5 and RAIDZ2 is equivalent to RAID 6. The main difference, as I understand it, is that the stripe width can be variable for RAIDZ, as opposed to RAID 5/6, which helps performance. So what I am wondering is: would it be possible to add this performance-enhancing technique to RAID 5 & 6 in the MD driver in the kernel, or is it tied too closely to how ZFS works?

    Read the article

  • How does IBM implement the WebSphere Application Server SDK for the Sun Solaris OS?

    - by Eng Al-Rawabdeh
    I deploy the same application in IBM WAS on different OSes (Windows, AIX and Sun Solaris). SDK errors appeared only with the SDK for Solaris. Some sites I checked say that the SDK on Solaris was built based on the Sun SDK; is that right? So please, I need to know whether IBM built the Solaris SDK from scratch or based it on the Sun SDK. More details: I installed the same IBM WAS application server on two servers as follows: 1) Server1 - OS (AIX), 2) Server2 - OS (Solaris). These two servers are on the same network and have the same configuration. Then I deployed a Java application (X) on both servers. Application X ran on Server1 (AIX) without any problem, but when I ran the application on Server2 (Solaris OS) I faced an SDK issue. So I need to know: what is the difference between the AIX WAS SDK and the Solaris WAS SDK? Note: I tried Windows and it ran without any problem.

    Read the article

  • About the External Graphics Card and CPU usage

    - by Balaji
    We are rendering 16 live streams on our client machine through one of our applications, and the video streams are 4CIF/MPEG-4/25 fps/4000 kbit/s. The configuration of the client machine is below. HP desktop machine: Microsoft Windows XP, Intel(R) Core 2 Duo CPU E8400 @ 3.00 GHz (2.99 GHz), 1.94 GB of RAM, Intel(R) Q45/Q43 Series Express Chipset (onboard). The CPU usage of the machine peaks at 99% for 16 streams. After some discussion, we decided to install an external graphics card to reduce the CPU usage, so we tried the following graphics cards: NVIDIA Quadro NVS 440 - 128 MB, Radeon HD 4350 - 512 MB GDDR2, Radeon HD 4350 - 1 GB DDR2, ASUS EAH4350 Silent 1 GB DDR2. But performance-wise there has been no difference, or even a drop in performance. So, what is the purpose of these external graphics cards? Will they really reduce the CPU usage? What parameters do we have to check if we want to reduce the CPU usage?

    Read the article

  • phpredis + pconnect

    - by john smith
    I am using phpredis on my PHP-based website. The web server I am using is the simplest Apache apt-get installation, with no configuration involved, as this is only a development environment. The issue I am facing is that, while using phpredis, there is basically no difference between the "connect" and "pconnect" commands: they both create a new connection every time, as I can see from the "info" command in redis-cli. Now, I am pretty sure it is because of the Apache configuration and the fact that it probably (most likely) is a multi-threaded environment, and therefore can't establish a single persistent connection. My question is basically about when I move to production: what would be the best choice of web server to avoid this problem? I remember using lighttpd with thousands of users and still getting only a very few (like 2 or 3) connections on MongoDB. Any ideas? Thanks in advance.

    Read the article

  • SQL Server backup and restore process

    - by Nai
    Just wondering what backup processes you guys have. I am currently running a weekly full database backup with daily differential backups. My understanding is that with such a setup, the difference between full recovery mode and simple recovery mode is that with full recovery mode I will be able to use the transaction logs to roll my DB back to a specific point in time after applying the latest differential backup. Assuming that in my scenario the last differential backup serves as my last and ultimate 'save point', I don't see a need to roll my DB back even further using the logs. This brings me to my question: are there any additional benefits to using full recovery mode with my current backup process?

    Read the article

  • SharePoint Designer not syncing consistently

    - by normalocity
    I've got a user who uses SharePoint Designer to maintain an internal intranet site. When syncing (remote-to-local) it appears to work at first, but it usually hangs about 2-3 minutes into the sync, when he's syncing to a sub-folder of his "My Documents". In this case, his "My Documents" is stored on a network share/profile. When I do the same thing, it works for me. The difference? My "My Documents" folder is stored locally. In other words, he's syncing from the remote server into a network share, and I'm syncing from the remote server onto a local drive. Any idea why having the sync destination on a network share, rather than a local drive, would cause this? When it locks up, we can still navigate to his "My Documents" folder, so I don't believe we're losing the connection to his drive, unless perhaps the connection is intermittent and SharePoint Designer isn't retrying the sync.

    Read the article

  • Media player only works as administrator?

    - by Jeremy
    It seems I can only get Media Player 12 to work as administrator. If I run it normally (I am in the Administrators group on my local PC), right-click on Music, and choose Manage Music Library, Media Player will sit and think for 5 or so seconds, then just not do anything: no dialog, no error. If I run as administrator, I can get into the Manage Music Library dialog and add the public folder containing my music. I've even tried granting Everyone access to the public folder. One thing to note is that I have recently set up a domain controller and added my PC to the domain. With my local account I never noticed this problem, but I've since created a domain account and am now seeing this issue. I can't find much difference between the local and domain accounts; both are in the Administrators group. Why would WMP require "Run as administrator"? OS: Windows 7 64-bit.

    Read the article

  • IPtables rate limiting: what are the differences between the recent and limit modules?

    - by TechZilla
    I am doing some rate limiting with IPtables, and I'm not sure whether I should use "recent" or "limit". What are the differences between the two? If they both achieve the same result, which one has better performance? I would like to know even if the difference would not be perceptible. I am looking to ACCEPT if under the limit and REJECT if over. I'm not interested in bandwidth throttling; I don't want a queue. I don't need any syntax examples; both have ample usage examples online. I have also used limit in the past. I appreciate any responses.

    Read the article

  • APC module causing strange error

    - by clifgriffin
    When I run php -v I get: PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib64/php/modules/apc.so' - /usr/lib64/php/modules/apc.so: undefined symbol: php_pcre_exec in Unknown on line 0. This isn't my first rodeo; I've set up APC multiple times. This is a MediaTemple Dedicated Virtual 4.0 with Plesk 11. Plesk 11 is essentially the only thing different from the other servers I've set this up on. I've verified that pcre-devel is installed. I've compiled APC from source as well as used pecl to install it; no difference. I also tried downgrading to APC 3.0.19, with no luck.

    Read the article

  • Does SQLIO lie when run from a Hyper-V guest on a VHD?

    - by ScottStonehouse
    SQLIO seems like a useful tool. I thought it would be interesting to try to measure the speed difference between a physical disk and a VHD, so I ran SQLIO on the Hyper-V host against the physical drive. Results seemed reasonable. Then I ran it from the guest to test the VHD (on the same physical disk). I expected it to be a bit slower, but instead it was way faster, with something like 0 ms average latency. So I'm trying to learn something here. It seems like Hyper-V is fooling SQLIO somehow, but I don't understand it well enough to figure it out. It's a dynamic VHD, no snapshots or anything, and the VHD is the only file on the disk. The physical disk is actually a RAID 1 of two SAS drives.

    Read the article

  • Is there a common X-Header for RFC 2821 "MAIL FROM"? Should it be DKIM signed?

    - by makerofthings7
    With regard to the difference between the RFC 2821 MAIL FROM and the RFC 2822 From: header, I'm considering having my MTA add a header specifying what was sent in the MAIL FROM portion of the envelope. The RFC 2821 address is the one used for receiving email bounce-backs, and is the address that is checked by SPF and some Sender ID configurations. The goal is to make diagnostics and debugging easier by having this low-level information in the email headers. What is an acceptable name for this SMTP header? Should this header be signed by DKIM? Is there any reason why it shouldn't be signed?

    Read the article

  • Which OS should I choose for my VPS?

    - by Camran
    I am about to order a virtual private server and have no experience with any Linux OS whatsoever. I am a fast learner, however... My VPS provider offers these OSes: Ubuntu 8.04 LTS, Ubuntu 8.04 LTS 64-bit, Ubuntu 9.10, Ubuntu 9.10 64-bit, Debian 5.0, Gentoo, Gentoo 64-bit, Ubuntu 8.04 LTS + Ruby on Rails. I don't know what these are; however, I have heard a lot about Ubuntu, and know there is a lot of information about it on the Internet. Will it make any difference which one I choose? I plan on running a classifieds website, which uses PHP, MySQL, Java (for Solr) and the usual standard stuff (HTML, JavaScript...). Which should I choose? And what is the next step after choosing one? Thanks

    Read the article

  • Using "mv" or "ditto" to merge folders in OS X

    - by Sandro Dzneladze
    Used to the Windows way of doing things, I just found out that OS X has no merge function: moving means replacing the folder. While this does make sense, I miss merging! I have two WordPress directories: folder 1 contains the default source and folder 2 contains my worked-on version with plugins, a custom theme, etc. I want to see the difference between the two, so I'm putting them under SVN. Folder 1 is already checked in; now, in theory, I should simply merge the contents of 2 into 1, replacing everything with the contents of 2 but leaving the hidden SVN files untouched. Unfortunately OS X, when moving, replaces the folder, so my SVN client goes crazy and doesn't understand the folder structure anymore. So, I believe my options are mv and ditto, but which one would you use in my situation, and how?

        sudo mv wordpress /Documents/svn/wwwholiday/trunk/wordpress

    I want the move to overwrite everything it finds, but leave alone whatever is already inside folder 1 and has no duplicate in folder 2.
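
    For illustration, here is a minimal Python sketch of the merge behaviour being asked for (not mv or ditto themselves, just the semantics: copy folder 2 over folder 1, overwriting duplicates and leaving everything else, including the hidden .svn directories, untouched). The two paths are the ones from the question:

        import os
        import shutil

        SRC = "wordpress"                                   # folder 2: the worked-on copy
        DST = "/Documents/svn/wwwholiday/trunk/wordpress"   # folder 1: the SVN working copy

        for root, dirs, files in os.walk(SRC):
            rel = os.path.relpath(root, SRC)
            target = os.path.join(DST, rel)
            os.makedirs(target, exist_ok=True)              # create missing sub-folders only
            for name in files:
                # copy2 overwrites an existing file of the same name; files that exist
                # only in DST (e.g. .svn metadata) are never touched
                shutil.copy2(os.path.join(root, name), os.path.join(target, name))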

    Read the article

  • Should this folder called Data be indexed?

    - by panny
    In the indexing options of Windows 7 there is a folder called Data which is excluded from indexing on the C:\ drive by default. Can someone confirm this, please? I was not able to locate that folder on my drive, nor include it in the search index. The difference in the number of indexed files is unsatisfying: the native Windows 7 indexing service indexes 377,703 files on six drives, while a third-party desktop search indexing service indexes 698,654 files on the same number of drives. Files under UAC control seem not to be indexed without the proper privileges. How can this be circumvented?

    Read the article

  • Why does resaving a 4 MB PDF file with Acrobat 10 produce a 3 MB file?

    - by Jian Lin
    I had some PDF files and tried opening one and doing some highlighting using Acrobat 10 (also called Adobe Reader X)... After highlighting, I saved the file (using a different filename), and the file went from 4 MB to 3 MB... Is it just compression? Or is it lowering the quality of the images? (Though I cannot see any difference.) What is the reason? If it is just compression, then why wasn't it done before, since WinZip-style compression technology has been mature for more than 10 or 12 years?

    Read the article

  • Trouble with internet connection: web pages are slow to open, if they open at all, unless I turn the VPN on, and then they open OK

    - by Caroline Coleman
    I am having problems with my internet connection. At the moment I am on a Mac, connected through a Netgear wireless router. The internet connection either won't open a web page at all, or if it does, it takes ages. However, if I turn my VPN on, the pages open at a normal speed. Also, Skype functions OK and I seem to be able to download files OK. I have tried connecting with a wire between the router and the computer, and it makes no difference.

    Read the article

  • A few questions about SLI

    - by toomanyairmiles
    Hi all, thanks in advance for your help. I've just added a second card to my system so I can add a third monitor. I'd got as far as determining that both cards need to use the same driver (after a blind alley with a cheap ATI card), so I'm now the proud owner of a second BFG 9800 GTX+ card. One is a BFG OCX and the other a BFG OC (a small difference in clock speeds, but they are in all other respects the same), but I wanted to know the following: 1) Is it worth adding the SLI connector? Will it really boost overall performance (I'm guessing that the OCX card would then perform as the OC card does)? 2) Are SLI connectors (the ones that run across the top of the cards) motherboard- or manufacturer-specific? 3) If I do SLI the cards, will I still be able to use all four monitor connectors, or just the two on the master card? I'm not a gamer; I'm an IA and web designer, so the system is mostly for Photoshop and Illustrator work and the occasional knock-around in Command & Conquer.

    Read the article

  • PowerShell Remoting: No credentials are available in the security package

    - by TheSciz
    I'm trying to use the following script to remotely create a cluster (it's for testing purposes) on a Windows Server 2012 VM:

        $password = ConvertTo-SecureString "xxxx" -AsPlainText -Force
        $cred = New-Object System.Management.Automation.PSCredential("domain\Administrator", $password)
        $session = New-PSSession 192.168.xxx.xxx -Credential $cred
        Invoke-Command -Session $session -ScriptBlock { New-Cluster -Name "ClusterTest" -Node HOSTNAME }

    I'm getting the following error:

        An error occurred while performing the operation.
        An error occurred while creating the cluster 'ClusterTest'.
        An error occurred creating cluster 'ClusterTest'.
        No credentials are available in the security package
            + CategoryInfo          : NotSpecified: (:) [New-Cluster], ClusterCmdletException
            + FullyQualifiedErrorId : New-Cluster,Microsoft.FailoverClusters.PowerShell.NewClusterCommand

    All of my other remote commands (installing/making changes to DNS, DHCP, NPAS, GP, etc.) work without an issue. Why is this one any different? The only difference is in the -ScriptBlock. Help!

    Read the article

  • How can I bulk rename files in a RAR or ZIP archive on the mac?

    - by Chris R
    I have a set of archive files, in both zip and rar formats, inside of which I need to rename some files. Specifically, I want to do something like this: for each archive file in a directory, for each file in the archive, if the file name matches the regular expression /(.* - [0-9]{2})([0-9]{2} - .)*/, rename the file as \1-\2. The trick isn't so much the generation of the new name; I can do that with bash or sed or anything else. It's the set of commands to manipulate the files inside the archives using rar/unrar or zip/unzip. (If it makes a difference, I'm re-formatting some CBR/CBZ files to get the double-page spreads to come up in the right order in SimpleComic; it interprets page 0203 as page 203, which makes the story a bit hard to follow.)
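
    For illustration, here is a minimal Python sketch of the zip/CBZ half of this (zip entries cannot be renamed in place, so each archive is rewritten; RAR/CBR archives are left out, since they generally cannot be rewritten with free tools). The pattern below assumes the question's regular expression was intended as (.* - [0-9]{2})([0-9]{2} - .*):

        import os
        import re
        import zipfile

        # Assumed intent of the question's pattern: two two-digit page numbers run together.
        PATTERN = re.compile(r"(.* - [0-9]{2})([0-9]{2} - .*)")

        def rename_members(path):
            """Rewrite one zip/cbz archive, inserting a '-' between the two page numbers."""
            tmp = path + ".tmp"
            with zipfile.ZipFile(path) as src, zipfile.ZipFile(tmp, "w", zipfile.ZIP_DEFLATED) as dst:
                for info in src.infolist():
                    m = PATTERN.match(info.filename)
                    new_name = m.group(1) + "-" + m.group(2) if m else info.filename
                    dst.writestr(new_name, src.read(info))
            os.replace(tmp, path)  # swap the rewritten archive into place

        for name in os.listdir("."):
            if name.lower().endswith((".zip", ".cbz")):
                rename_members(name)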

    Read the article

  • Can I change a MySQL table back and forth between InnoDB and MyISAM without any problems?

    - by Daniel Magliola
    I have a site with a decently big database: 3 GB in size, with a couple of tables with a dozen million records. It's currently 100% MyISAM, and I have the feeling that the server is going slower than it should because of too much locking, so I'd like to try moving to InnoDB and see if that makes things better. However, I need to do that directly in production, because obviously without load this doesn't make any difference. I'm a bit worried about this, though, because InnoDB actually has the potential to be slower, so the question is: if I convert all tables to InnoDB and it turns out I'm worse off than before, can I go back to MyISAM without losing anything? Can you think of any problems I might encounter? (For example, I know that InnoDB stores all data in one big file that only gets bigger; can this be a problem?) Thank you very much, Daniel
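
    For reference, the conversion in either direction is a per-table ALTER TABLE ... ENGINE=... statement. Below is a minimal Python sketch (the mysql-connector-python package, host and credentials are placeholder assumptions) that converts every table in a schema; running it again with the engine swapped goes back the other way:

        import mysql.connector  # assumption: the mysql-connector-python package is installed

        TARGET_ENGINE = "InnoDB"  # set to "MyISAM" to convert back

        conn = mysql.connector.connect(
            host="localhost", user="root", password="secret", database="mysite"  # placeholders
        )
        cur = conn.cursor()

        cur.execute("SHOW TABLES")
        tables = [row[0] for row in cur.fetchall()]

        for table in tables:
            # each ALTER rebuilds the table, which locks it and can take a while on large tables
            cur.execute("ALTER TABLE `{}` ENGINE={}".format(table, TARGET_ENGINE))

        cur.close()
        conn.close()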

    Read the article

  • New Linux Mint User Networking questions

    - by nyCecilia
    I have a ReadyNAS that I've been using with XP, Vista, and Win7. Because of weirdness with Vista, it is set up for full read/write guest access. Now I have a Linux Mint netbook. I have set up SMB on it and can read from the ReadyNAS SMB shares, but I can't write. What else can I check? Part 2 (keep in mind my network knowledge is small... or smaller): what is the difference between NFS and SMB, and can a ReadyNAS be set up to allow access to the SMB shares via NFS (if I can figure out NFS, lol)? A link to a guide for beginners would be appreciated; Google searching "Linux Mint ReadyNAS" doesn't give me anything useful.

    Read the article

  • SQL Server tempdb size seems large, is this normal?

    - by Abe Miessler
    From what I understand, this system database is used to hold temporary tables, intermediate results and other temporary information. On one of my database instances I have a tempdb that seems very large (30 GB). This database has not been modified (going by the "last modified date" on the .mdf file) in over a week. Is it normal for tempdb to remain that large for that long a period? It seems to me that it should be updated fairly often and should return the space it is using fairly quickly... Am I way off here, or is SQL Server doing something weird? FYI: this is a SharePoint 2010 database, not sure if that makes a difference.

    Read the article

  • HDMI Sound for HTPC

    - by Brent Arias
    I have (perhaps) the same problem as stated in this other HDMI sound to HTPC question. I tried the advice of clicking on the speaker in the system tray. I can see the HDMI audio device I want to use, and that device claims to be functioning properly, but there is no sound, and it won't let me select it as the active audio device. When I run the troubleshooter, it says that there are no speakers connected. I would think this is because my computer is unable to pipe sound through the video card (preventing the HDMI cable from carrying it), except that it really does claim to have an HDMI sound device that is working correctly. So I'm not sure what is wrong at this point. Thoughts? My system is Windows 7 x64. In case it makes a difference, the video card I'm using is a GeForce GTX 560.

    Read the article

  • Web Folder size/quota reporting tool?

    - by nctrnl
    I am currently using a Visual Basic script to determine how big the web folders are and what quota has been decided for each folder. The quota is in no way a physical limit, just a value set by me to decide whether a user is using too much space or not. The script does the job quite neatly and sends an HTML report by mail on a regular basis. The problem is that it's such a hassle to insert new quotas, since I have to fiddle around with the code; a central "control panel" with an overview and the ability to insert new quotas would be more suitable. Is there any software that can do the following: scan specified folders and subfolders, report the file sizes and present them in some sort of interface (could be a PHP/MySQL solution), and let me specify a quota and see the difference? It is really important that the quota handling is made simple, so that a non-technician can handle it.
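
    For illustration, a minimal Python sketch of the scan-and-compare step such a tool would perform (the folder paths and quota values are placeholders; a real tool would read them from a config file or database and render the report in a web interface):

        import os

        # Placeholder quota table: folder path -> quota in MB
        QUOTAS_MB = {
            "/var/www/site-a": 500,
            "/var/www/site-b": 1000,
        }

        def folder_size_mb(path):
            """Walk a folder tree and return its total size in megabytes."""
            total = 0
            for root, _dirs, files in os.walk(path):
                for name in files:
                    try:
                        total += os.path.getsize(os.path.join(root, name))
                    except OSError:
                        pass  # skip files that disappear or cannot be read
            return total / (1024 * 1024)

        for folder, quota in QUOTAS_MB.items():
            used = folder_size_mb(folder)
            print("{}: {:.1f} MB used, quota {} MB, difference {:.1f} MB".format(
                folder, used, quota, quota - used))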

    Read the article
