
  • SQL Server Performance & Latching

    - by Colin
    I have a SQL Server 2000 instance which runs several concurrent SELECT statements against a group of 4 or 5 tables. The server's performance during these queries often degrades badly: the queries can take up to 10x as long as other runs of the same query, and it gets to the point where simple operations like listing the tables in Object Explorer or running sp_who can take several minutes. I've done my best to identify the cause, and the only performance metric I've found to be off base is Average Latch Wait Time. I've read that a wait time of over 1 second is bad, and mine ranges anywhere from 20 to 75 seconds under heavy use. So my question is: what could be the issue? Shouldn't SQL Server be able to handle multiple SELECTs on a single table without losing so much performance? Can anyone suggest where to go from here to investigate this problem? Thanks for the help.
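
    One place to start digging, assuming the bottleneck really is latching: SQL Server 2000 exposes its aggregate wait statistics through DBCC SQLPERF, and the PAGELATCH_*/PAGEIOLATCH_* rows distinguish in-memory contention from disk I/O stalls. A diagnostic sketch:

        -- Reset the wait counters, reproduce the slow period, then sample them
        DBCC SQLPERF(waitstats, clear)
        -- ... run the concurrent SELECTs ...
        DBCC SQLPERF(waitstats)
        -- High PAGEIOLATCH_* waits point at the disk subsystem;
        -- high PAGELATCH_* waits point at contention on in-memory pages.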

  • Ubuntu 12.04.1 Update Manager and Synaptic Package Manager not working

    - by Ashoke
    Recently installed Ubuntu 12.04.1, 64-bit, on an Intel quad-core system. There is a red circle with a white minus sign in the upper right-hand corner, and the end of the long grey message displayed below it says that installed packages have unmet dependencies. None of the following is working. Ubuntu Software Center won't open. Update Manager reports: "Could not initialize the package information. An unresolvable problem occurred while initializing the package information. Please report this bug against the 'update-manager' package and include the following error message: 'E:Encountered a section with no Package: header, E:Problem with MergeList /var/lib/apt/lists/in.archive.ubuntu.com_ubuntu_dists_precise-updates_multiverse_binary-i386_Packages, E:The package lists or status file could not be parsed or opened.'" Synaptic Package Manager reports an error with the following details: "E: Encountered a section with no Package: header / E: Problem with MergeList /var/lib/apt/lists/in.archive.ubuntu.com_ubuntu_dists_precise-updates_multiverse_binary-i386_Packages / E: The package lists or status file could not be parsed or opened. / E: _cache->open() failed, please report." Otherwise, the computer is working fine with broadband Wi-Fi internet. Please help.
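
    For what it's worth, the commonly suggested fix for this MergeList error, assuming the downloaded package lists are simply corrupted, is to delete the cached lists and let apt rebuild them:

        sudo rm -vf /var/lib/apt/lists/*    # the 'partial' subdirectory will be skipped; that's fine
        sudo apt-get update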

  • Strategy for managing lots of pictures for a website

    - by Nate
    I'm starting a new website that will (hopefully) have a lot of user-generated pictures. I'm trying to figure out the best way to store and serve these pictures. The CMS I'm using (Umbraco) has a media library that puts a folder on the server for each image. Inside of there you can have different sizes of that same image. That folder has an ID on it, and the database has additional information for that image along with the ID of the folder. This works great for small sites, but what if the pictures get up to 10,000, 100,000 or 1,000,000? It seems like the lookup on the directory would take a long time to find the correct folder. I'm on Windows Server 2008, if that makes a difference. I'm not so worried about load; I can load-balance my server pretty easily and replicate the images across the servers. The nature of the site won't have a lot of users on it either, but it could have a lot of pics. Thanks. -Nate EDIT After some thought, I think I'm going to create a directory for each user under a root image folder, then have each user's pictures under that. I would be pretty stoked if I had even 5,000 users, so that shouldn't be too bad of a linear lookup. If it does get slow I will break it down into folders like /media/a/adam/image123.png. If it ever gets really big I will expand the above method to build a bigger tree. That would take a LOT of content though.
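
    A variation on that tree idea, sketched below with hypothetical names: derive the subfolders from a hash of the image ID instead of the user name, which bounds the fan-out of every directory regardless of how images are distributed across users.

        import hashlib, os

        def image_path(root, image_id, filename):
            # Two hash-derived levels of 256 buckets each cap any single
            # directory at roughly total_images / 65536 entries.
            h = hashlib.md5(str(image_id).encode()).hexdigest()
            return os.path.join(root, h[:2], h[2:4], str(image_id), filename)

        print(image_path("/media", 123, "image123.png"))
        # e.g. /media/20/2c/123/image123.png  (separator per OS)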

  • ngen.exe is constantly using CPU

    - by teedyay
    I recently installed Windows 7. This was a clean install (i.e. not an upgrade from another version of Windows), but I did install a bunch of other programs. All mainstream applications - nothing wacky. Since then, my CPU usage has been constantly at around 50%. Task Manager shows me that ngen.exe is the culprit. It's not a long-running task: I can see that it gets a new PID at least once a second, so I guess something is constantly triggering it. It does it all the time, even when I have no applications running. Has anyone else seen this? How do I find out what's causing this?
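
    In case it helps: ngen.exe is the .NET Native Image Generator, and installers queue assemblies for it to precompile in the background, which matches the "new PID every second" pattern. A commonly suggested remedy, hedged because the framework directory varies by version and bitness (v2.0.50727 or v4.0.30319, Framework or Framework64), is to drain the queue in one go from an elevated command prompt:

        cd %WINDIR%\Microsoft.NET\Framework\v2.0.50727
        ngen.exe executeQueuedItems
        rem On 64-bit Windows, repeat in the matching Framework64 directory.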

  • How to compile VTK on a MacBook Air

    - by Ilan Tal
    I am trying to compile VTK on a Mac after successfully doing so on Linux. I am using CMake 2.8.9 and for a long time I had a problem where it couldn't find jni.h. If anyone else has the problem, the answer is to have JAVA_INCLUDE_PATH2 point to "-framework JavaVM". After that it successfully finds JNI and does a "Configure" followed by "Generate". According to the instructions I have the next step should be easy: in a terminal run "make -j 2". I get a message "-bash: make: command not found". Strange.... I am in the vtkBuild directory, which seems the most reasonable place. I even tried the parent directory where I have the VTK source but that didn't work either. I ran a make under Linux and had no troubles, so what do I need to do with the Mac? Thanks, Ilan
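
    The "command not found" suggests the Apple command-line developer tools simply aren't installed; CMake's Generate step only writes the makefiles. A hedged fix, since the mechanism differs by Xcode version, then re-run make from whatever build directory you configured:

        # On newer OS X releases:
        xcode-select --install
        # On Xcode 4.x: Xcode > Preferences > Downloads > install "Command Line Tools"

        # then, back in the build directory:
        cd ~/vtkBuild && make -j 2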

  • How to fix a blue screen in Windows 7 with multi-boot?

    - by Ismail Sensei
    I have an HP 6730s laptop with two operating systems: Windows 7 Ultimate 32-bit and CentOS 6.4 64-bit. GRUB2 is not installed in the MBR; I use the Windows bootloader. When I choose Windows at startup, a blue screen appears with an UNMOUNTABLE_BOOT_VOLUME error, so I tried suggestions from similar questions here (open a Command Prompt and run: chkdsk /R C:). The problem is I can't get into the repair environment: Startup Repair ran for more than 2 hours with nothing happening, and when I boot from my Windows 7 DVD it loads the files and then the same thing happens, nothing shows up, so I couldn't use a command prompt. CentOS works just fine, and I can mount the D partition normally, but the C partition gives an error telling me to go to Windows and repair it with the chkdsk command, and that is where I am stuck.
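
    Since CentOS boots fine, one hedged thing to try from there is ntfsfix (part of the ntfsprogs/ntfs-3g tools; on CentOS it may need to be installed from a third-party repository such as EPEL). It clears the NTFS dirty flag and schedules a check, which is sometimes enough to make the volume mountable again; the device name below is an assumption:

        sudo fdisk -l              # identify the Windows system partition
        sudo ntfsfix /dev/sda2     # assumption: replace with the actual C: partition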

  • What is the best free software to learn touch-typing?

    - by gojira
    What is the best free software to learn touch-typing? Features it needs to have: it should NOT display the keyboard layout on the screen; it should give detailed statistics that actually measure progress (for which key do I have the highest error rate, graphs showing how typing speed improved over time, etc.); and it should enable me to actually learn touch-typing in about one long weekend where I don't do much else than learn to touch-type. It would also be very good if it were possible to load a text file and have the program use words from that text file for the exercises. My goals are: at least the same typing speed as I have now, but with touch-typing, and to be able to look only at the screen when typing. P.S.: EDIT: I forgot to mention, I'm using Win 7. And I know what the Dvorak and Colemak keyboard layouts are, but I'm not interested in them. My question was with respect to the standard US keyboard layout.

  • How to open ports on router for better torrent performance

    - by Mehper C. Palavuzlar
    I've been using utorrent to download and upload torrents for a long time. Recently someone told me that I need to open port(s) for utorrent from my router settings for better downloading and uploading performance. Is it true? If yes, how can I do that? My utorrent version: 2.0 and the port used for incoming connections: 61829. My router: Yaksu S200 ADSL router modem and I can reach its settings via web interface. I looked at the settings but they seem a bit complicated to me. Other info you may need to know: I have dynamic IP. I'm using Win7 x64.

  • Multiple apps in one "icon" on the iPod touch [closed]

    - by Jerry
    I have researched but can't find any discussion about moving several apps into one "icon" on the iPod touch screen. I have moved 5 apps (all the same category) into what is normally a single app icon. The title on the icon reads "Games" - all are games. With all the apps jiggling, I dragged one game app on top of another game app; they moved to be side by side and the title automatically reads "Games". Is this "OK" to do? You can have nine apps in each of the 16 spaces available on each screen. Will this hurt the touch? As long as you have space (GBs), is this OK? Has anyone done or heard of this? Any help is appreciated.

  • Is a cluster the most cost-effective redundancy method for Windows Server 2003?

    - by Ryan
    We had a server with bad RAM, which caused a long outage while they figured it out, and our client-facing apps had to go down for a while. We are coming up with a solution for instant fail-over but are not sure what the most cost-effective method would be. Is a Windows Server cluster the best method for this? Also note we are using Parallels Virtuozzo, if that makes any difference here. We found Parallels has a documented method for setting this up, but it said it required a Domain Controller as well as a fiber connection to shared storage; is all that really needed? Thanks.

  • How do I handle having too many links on a webpage because of my menu

    - by RandomBen
    I am developing a website that has a drop-down menu at the top of it. The menu has around 100 links in it that are repeated on every page. Every page also has some number of links below the menu that may or may not be in the menu itself. My issue is that Google says they generally don't like pages with more than 100 links on them. Is there any way to change the links in the menu so that they no longer "count" towards my max of 100 links? It seems like there should be an easy way to do this, but there really doesn't seem to be. rel=nofollow still counts towards the number of links on the page, at least according to Google, so what other options do I have? I looked into where the 100 comes from and I found that it used to be here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769#2 but that is no longer the case. I found a more definitive and frankly muddier answer here: http://www.seomoz.org/blog/questions-answers-with-googles-spam-guru from Matt Cutts, from 2007. Long story short, in 2007 they still felt 100 links was a good number, but they stated you could go far beyond that. In fact, they said that pages with high PageRank could have 200-300. It did sound like having many links could reduce the PageRank of the page with all of the links, or possibly of all of the items linked to. Also, I know IIS7's SEO Toolkit 1.0 suggests that pages should have no more than 250 links.

  • Visual Studio 2010 Is Here!

    - by Bill Evjen
    I think back to the days of the first versions of Visual Studio (when it was called Visual Studio .NET, remember?) and I think about how far Microsoft has come with this IDE. It really is the best IDE on the market. There is so much to this IDE it is amazing. It can now really handle managing your complete software application development lifecycle. For me, it is (besides Windows 7) the best and most successful product Microsoft has developed. You can obviously get this now, and it is available on MSDN and some other places: the MSDN Visual Studio trial editions, and the Visual Studio 2010 Express editions (free). You will also find great info at the Visual Studio Developer Center. Some other interesting tidbits of info: JetBrains' ReSharper 5.0 has been released for VS2010; Oracle will have the new Oracle Dev Tools for VS2010 within one month - http://bit.ly/9gC9NE; and on why there is no 64-bit version of Visual Studio - http://bit.ly/dhhwAj. In installing this version of Visual Studio, if you have been working on the previous RC builds, then you are going to want to uninstall those previous editions of the 2010 product. You can do this through the Add/Remove Programs dialog, where you are going to want to select the appropriate item from the long list of Visual Studio items. You will then step through the Visual Studio dialog (it will seem as if you are installing it again) until you come to a point where you can select the option to uninstall the entire application. If you have installed the Silverlight 4 RC, then you are also going to want to uninstall it, along with the "Update for Visual Studio 2010 (KB976272)", before installing Silverlight RC2 - which you can find on www.silverlight.net. Technorati Tags: vs2010,.net,visualstudio,microsoft

  • Wireless shows up as disabled, how can I get it working?

    - by Lazer
        $ sudo iwconfig
        lo        no wireless extensions.
        eth0      no wireless extensions.
        wlan0     IEEE 802.11bg  ESSID:off/any
                  Mode:Managed  Access Point: Not-Associated  Tx-Power=0 dBm
                  Retry long limit:7  RTS thr:off  Fragment thr:off
                  Encryption key:off
                  Power Management:off
        pan0      no wireless extensions.
        $

    This is what pops up when I click the two-computers icon. What should I do to get Wi-Fi working on this machine?

        $ sudo ifconfig wlan0 up
        SIOCSIFFLAGS: No such file or directory
        $
        $ lspci | tail
        00:1d.1 USB Controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #2 (rev 03)
        00:1d.2 USB Controller: Intel Corporation 82801I (ICH9 Family) USB UHCI Controller #3 (rev 03)
        00:1d.7 USB Controller: Intel Corporation 82801I (ICH9 Family) USB2 EHCI Controller #1 (rev 03)
        00:1e.0 PCI bridge: Intel Corporation 82801 Mobile PCI Bridge (rev 93)
        00:1f.0 ISA bridge: Intel Corporation ICH9M LPC Interface Controller (rev 03)
        00:1f.2 SATA controller: Intel Corporation ICH9M/M-E SATA AHCI Controller (rev 03)
        00:1f.3 SMBus: Intel Corporation 82801I (ICH9 Family) SMBus Controller (rev 03)
        01:00.0 VGA compatible controller: ATI Technologies Inc M92 LP [Mobility Radeon HD 4300 Series]
        09:00.0 Ethernet controller: Marvell Technology Group Ltd. 88E8040 PCI-E Fast Ethernet Controller (rev 13)
        0c:00.0 Network controller: Broadcom Corporation BCM4312 802.11b/g (rev 01)
        $
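
    Tx-Power=0 dBm with no associated access point usually means the radio is soft- or hard-blocked, and the BCM4312 also needs non-free firmware that Ubuntu-family distros don't ship by default. A hedged sequence to try (package name per Ubuntu; older releases used firmware-b43-lpphy-installer for this LP-PHY chip):

        rfkill list all                              # check for soft/hard blocks (and the laptop's Wi-Fi switch)
        sudo rfkill unblock all
        sudo apt-get install firmware-b43-installer  # fetches the b43 firmware
        sudo modprobe -r b43 && sudo modprobe b43    # reload the driver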

  • Anyone know where I can download a copy of Sun Java System Active Server Pages 4.0.3 for Solaris

    - by ewengcameron
    I've contacted Sun regarding this, and they have told me that the download is no longer available as Active Server Pages 4.0.3 is now end-of-life. We need to upgrade our server to 4.0.3 to achieve PCI-DSS compliance. Anyone know of a site where I can download older copies of Sun files? Sun offers 4.0.1 and 4.0.2 for download, but not 4.0.3, which is going to cause problems come October when Visa stops accepting transactions from non-PCI-compliant servers. If Sun kept their naming system consistent across versions, the file would be called "sjsasp403-sol-sparc.tar". I know the real solution is to upgrade every site on the server to use a different server language, i.e. PHP, and in the long term this is our goal, but we have over 100 sites requiring upgrading and it's not a viable solution to get this done before October.

  • A story from SQLvdb and Idera

    - by Peter Larsson
    A year or so back, I struggled with some consistency problems, so I figured out I needed a way to "mount" backup files as a virtual database. At the time (SQL Server 2005 and SQL Server 2008) my choice fell on Idera's SQLvdb because it felt easy enough to use. I used it a few times and it worked great. Some time later we upgraded to SQL Server 2008 R2 and I didn't use SQLvdb for a long time. Until yesterday... I was upset that suddenly SQLvdb took more than 2 hours to mount the backup file (if it succeeded at all). I uninstalled the application and went for lunch. After lunch, I decided to give SQLvdb another chance, so I emailed their tech support and got a response within 30 minutes or so. Now, since I am a SQL Server MVP, they gave me another serial number than my first, and I downloaded and installed a newer version. But this version was also really slow. I emailed them back with the additional information they requested and, to my surprise, there was an email this morning when I came back to work in which Idera explained some of the issues (bugs) and asked me to test a newer version. I did, and now a fresh mount of a 100GB database (compressed to 20GB with native compression) located on our SAN takes less than 6 minutes! Thank you. //Peter

  • How many cron jobs are too many?

    - by guitar-
    I have a couple of cron jobs for basic maintenance which aren't very resource-intensive. I also have custom task scheduling, which is just calling a .php file and passing information via GET (i.e. cronjob.php?param1=param ...). These can add up pretty quickly. They just call system commands and run external programs (Nmap is one of them), and they usually don't take long either. Anyway, can anyone tell me roughly at what point is it too many? I know it's hard to say, since it depends on what job is being run and how often, but at what point does the crontab program start "struggling"? Anyone have any idea? Thanks.
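
    For whatever it's worth, cron itself is rarely the bottleneck; the load comes from what the jobs execute and from many of them firing on the same minute. Two hedged habits that help (paths hypothetical): stagger the schedules, and pass parameters as CLI arguments rather than via GET through the web server:

        # crontab -e
        */5 * * * *  php /var/www/cronjob.php param1   # read it via $argv[1] instead of $_GET
        2,32 * * * * php /var/www/scan.php             # offset from :00 to avoid a stampede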

  • We Found 100 Manufacturing Heroes That Focus on Innovation!

    - by Stephen Slade
    There’s a good piece written by Ann Grackin of ChainLink Research on the Manufacturing Leadership 100 Awards program held recently in Palm Beach, FL, Apr 30-May 3, 2012. The article (link below) summarizes the Summit, with specific focus on manufacturing innovation. There were several informative keynotes and sessions from industrial leaders who are leveraging the latest tools and technologies to make better decisions. Ann writes that she was a panelist along with Cindy Reese, SVP, Worldwide Operations, Oracle, and Steven Tungate, VP/GM, Supply Chain & Innovation, Toshiba America Business Solutions, discussing Factories and Supply Networks of the Future. Ann writes: “So what are these manufacturers doing? Significant rationalization of the supply base (Toshiba, especially, has this issue since they have a long history of many acquisitions), streamlining production to increase productivity, and looking for lower-cost countries for manufacturing…. No doubt firms have global customer bases, so they need to be present in these markets. However, a low-cost-country manufacturing source does introduce more risk in the supply chain. And that was discussed. Quality, security, and intellectual property protection were the critical global manufacturing issues also discussed. “Cindy (Reese) told a fascinating story about Oracle’s acquisition of Sun and the supply chain that was subsequently created. Here was one of the key points: Although Oracle sells on a global basis, they now do their own factory-installed software. This keeps potential ‘factory-installed malware’ from getting into the servers at contract manufacturers, and prevents pirated software. In this way, Oracle ensures that they deliver the quality and security people expect”. Learn more about the Manufacturing Leadership 100 program from Manufacturing Executive at: http://www.mlsummit.com/ Full Article Link: http://www.clresearch.com/research/detail.cfm?guid=52327213-3048-79ED-99D4-E433DA64D4F0

  • How can I synchronize Google contacts and calendar with Palm Centro and Outlook 2000?

    - by meomaxy
    Seeking a free solution to synchronize my calendar and contacts between my Palm Centro and Google contacts and calendar. I currently sync my Palm contacts and calendar with Outlook 2000 on Windows XP. Google Calendar Sync does not support Outlook 2000, and I would rather not pay to upgrade Outlook just for this, as I don't even use Outlook much anymore. I wouldn't mind migrating my calendar and contacts off of it, as long as I can still sync with the Palm. I don't have an unlimited data plan for my Centro, so I can't use a solution that wirelessly syncs directly to Google. I have set up GCALDaemon to sync up my Google calendar with multiple machines running Rainlendar, but I have not yet figured out a way to synchronize that data with the Palm.

  • GIS-based data visualization and maintenance tool

    - by Dave Jarvis
    Background: Looking to leverage an existing GIS system for exploring organizational data. Architecture: At a high level, the most basic usage would be as follows: the user visits a web site; the system presents a map (having regions, cities, and buildings); the user drills down on the map to a particular building; the system provides a basic CRUD interface; and the user can view and modify information about personnel (e.g., their assigned teams), equipment (e.g., network appliances), applications, and the building itself (e.g., contact and phone numbers). Ideally, all the components should be open source (or otherwise free). Problem: This must be a small project that needs a quick (but functional) prototype, mostly to confirm whether or not such a system would be useful in the long term. Questions: What software components would you use to quickly develop a working prototype? What open-source solutions already exist, if any? Ideas: Here is what I am thinking: PostGIS - define the regions, cities, and sites; Google Maps - display an interactive, clickable map; GeoJSON - protocol between PostGIS and Google Maps; Seam - CRUD interface. Custom development would entail: installation and configuration (SSH for remote logins, Subversion or git, PostgreSQL, PostGIS, Java, Tomcat, Seam, JasperReports); entering GIS information into PostGIS; aggregating data sources into the PostgreSQL database; developing the starting page for the map interface; developing the clickable Google Maps interface; developing summary reports; and developing the CRUD interface using Seam for data maintenance. Surely something like this already exists? Thank you!
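
    On the PostGIS-to-Google-Maps leg of that sketch: PostGIS can emit GeoJSON directly, so a thin endpoint can hand features straight to the map. A minimal query (table and column names are hypothetical):

        SELECT building_id,
               name,
               ST_AsGeoJSON(geom) AS geometry   -- PostGIS renders the geometry as GeoJSON
        FROM   buildings
        WHERE  city_id = 42;                    -- drill-down filter from the map click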

  • How to refactor while keeping accuracy and redundancy?

    - by jluzwick
    Before I ask this question, I will preface it with our environment: my team consists of 3 programmers working on multiple different projects, so our use of testing is mostly limited to very general black-box testing. Also take the following as given: unit tests will eventually be written, but I'm under strict orders to refactor first, and common Test-Driven Development techniques don't apply in this environment because my time is limited. I understand that if this were done correctly, our team would actually save money in the long term by building unit tests beforehand. I'm about to refactor a fairly large portion of the code, which is critical. While I believe my code will work correctly when done and after our black-box testing, I realize that there will be new data that the new code might not be able to handle. What I wanted to know is how to keep the old code, which functions 98% of the time, so that we can call those subroutines in case the new code doesn't work properly. Right now I'm thinking of separating the old code into a separate class file and adding a variable to our config that will tell the program which code to use. Is there a better way to handle this? NOTE: We do use revision control and we have archived builds, so the client could always revert to a previous build, but I would like to see if there is a decent way of doing this besides reverting, so they can still use the other new functionality delivered in the new build. Edit: While I agree I will need to write unit tests for this, I don't believe I will capture everything with them. I'm looking for ways to easily revert to the old, functional code should anything happen. While I know this is a poor practice, I'm planning on removing this code after our team can guarantee that the new code works to the same standards as the old.
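
    The config-variable idea is essentially a feature toggle around a strategy seam. A minimal, self-contained sketch (all names hypothetical) of routing through the new code with an automatic fallback to the proven path:

        class UnsupportedInput(Exception):
            pass

        def legacy_process(record):          # the old code that works 98% of the time
            return ("legacy", record)

        def new_process(record):             # the refactored path under trial
            if not isinstance(record, dict):
                raise UnsupportedInput(record)
            return ("new", record)

        def process(record, config):
            """Use the new code when the toggle is on; fall back on data it can't handle."""
            if config.get("use_new_code", False):
                try:
                    return new_process(record)
                except UnsupportedInput:
                    return legacy_process(record)
            return legacy_process(record)

        print(process({"id": 1}, {"use_new_code": True}))   # ('new', {'id': 1})
        print(process("raw line", {"use_new_code": True}))  # ('legacy', 'raw line')

    Flipping use_new_code off restores the old behavior wholesale, while the except branch covers the "new data the new code can't handle" case per call; logging every fallback also tells you what the rewrite still misses.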

  • Sorry. Not Much Happened Today!

    - by steve.diamond
    And THAT blog headline is dedicated to Seth Godin, who recently wrote that unlike its print brethren, digital media outlets aren't burdened with having to make their articles long enough to match the number of surrounding ad pages. He states that just because you CAN write more doesn't mean you SHOULD. Well, you don't have to tell me that twice. So to continue my rambling entry today, I'd suggest you read this post by Donal Daly on 10 steps to intelligent Social CRM for Sales. No seriously, read it. It's almost like a Groundswell Cliff Notes for sales people. I particularly love his third point. Of course I haven't "gotten" it yet, but I've got a whole life time, for crying out loud. Seriously, this is a great read and a fast one. And finally, in the department of longer reads, a thanks and shout out to Paul Greenberg for mentioning Oracle's new iPad app for Siebel CRM in his ZDNet blog. Hey, I warned you...not much happened today. Per se!

  • Why would a server not send a SYN/ACK packet in response to a SYN packet

    - by codemonkey
    Lately, we've become aware of a TCP connection issue that is mostly limited to Mac and Linux users who browse our websites. From the user perspective, it presents itself as a really long connection time to our websites (11 seconds). We've managed to track down the technical signature of this problem, but can't figure out why it is happening or how to fix it. Basically, what is happening is that the client's machine sends the SYN packet to establish the TCP connection and the web server receives it, but does not respond with the SYN/ACK packet. After the client has sent many SYN packets, the server finally responds with a SYN/ACK packet and everything is fine for the remainder of the connection. And, of course, the kicker: it is intermittent and does not happen all the time (though it does happen 10-30% of the time). We are using Fedora 12 Linux as the OS and Nginx as the web server.
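
    That signature (SYNs ignored until a retransmit finally gets through) is the classic symptom of a full listen queue: the kernel silently drops SYNs when the backlog overflows, and a later retry lands after it drains. Some hedged diagnostics for the Fedora box:

        netstat -s | egrep -i 'listen|overflow'   # look for "SYNs to LISTEN sockets dropped" / listen queue overflows
        sysctl net.ipv4.tcp_max_syn_backlog net.core.somaxconn
        # If the counters climb, raise the sysctls and nginx's own per-listener queue:
        #   listen 80 backlog=1024;   (in the nginx server block)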

  • Apache2 BufferedLogs On - anybody using it?

    - by Qiqi
    Greetings. I am wondering whether anybody is using BufferedLogs On with Apache2 and has found any issues. The feature is marked as experimental, but has been for many years now, so I guess it's fairly stable. I am running some servers with constrained disk I/O capacity at the moment, so I turned it on hoping that even a small benefit could help in the long run ;-) I do see several to several hundred requests per second, so to my mind there is really no need to write to the log after each request, because honestly I don't think my filesystem is the best handler for many unnecessary writes (it's OCFS2, shared among several DomUs in Xen).
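
    For reference, the directive belongs to mod_log_config and applies server-wide, so a minimal sketch of the config is just:

        # httpd.conf - buffers entries in memory and writes them in batches
        BufferedLogs On
        CustomLog /var/log/apache2/access.log combined

    The trade-off is that log lines lag behind their requests, and an unclean crash can lose whatever was still buffered; on an I/O-starved OCFS2 volume that is usually an acceptable price.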

  • Matte or non-widescreen laptop? Do they exist?

    - by Alan Harris-Reid
    Does anyone know of any matte-screen laptops being sold now (15.6" or 17") in the UK? All I can find is the Dell Vostro 3500/3700 range, but there is a premium of around £200 over the price of their Inspiron range (for the 17" model), and I find it hard to justify the extra cost just to have a matte screen. I do not like glossy screens, but it seems the laptop industry has gone the way of "glossy is better - let's get rid of matte". I have read, and heard from other developers, that as long as there are no strong light sources to reflect off the screen, one can soon get used to a glossy screen, but I am yet to be convinced. I would also be interested if anyone knows of any laptops with a non-16:9 screen. I find this ratio too wide and not high enough for the work I do. 16:10 or lower would be better. Any opinions would be appreciated. Alan

  • How does a segment based rendering engine work?

    - by Calmarius
    As far as I know, Descent was one of the first games that featured a fully 3D environment, and it used a segment-based rendering engine. Its levels are built from cubic segments (these cubes may be deformed, as long as each remains convex and its sides remain roughly flat). The cubes are connected by their sides: connected sides are traversable (doors or grids may be placed on them), while unconnected sides are non-traversable walls. So the game is played inside this complex. Descent was software-rendered and had to be very fast to be playable on the 10-100 MHz processors of that age. Some later levels of the game are huge and contain thousands of segments, yet they are still rendered reasonably fast, so I think they tried to minimize the number of cubes rendered somehow. How did they choose which cubes to render for a given location? As far as I know they used a kind of portal rendering, but I couldn't find what technique was used in this particular kind of engine. I think the fact that the levels are built from convex quadrilateral hexahedrons can be exploited.
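
    The usual reconstruction of this engine family (a sketch, not a claim about Descent's actual source): start in the segment containing the camera, draw it, and recurse through every connected side whose projected opening still intersects the current screen rectangle, narrowing that rectangle at each portal. Convexity is what makes this sound: within one segment nothing occludes out of order, and anything visible in a neighbor must be seen through the shared side. Simplified Python, with screen-space rectangles standing in for the clipped frustum:

        from dataclasses import dataclass, field
        from typing import Optional, List

        @dataclass
        class Rect:                                   # screen-space portal bounds
            x0: float; y0: float; x1: float; y1: float
            def clip(self, o: "Rect") -> "Rect":      # rectangle intersection
                return Rect(max(self.x0, o.x0), max(self.y0, o.y0),
                            min(self.x1, o.x1), min(self.y1, o.y1))
            def empty(self) -> bool:
                return self.x0 >= self.x1 or self.y0 >= self.y1

        @dataclass
        class Side:
            screen_rect: Rect                         # projected bounds of this face
            neighbor: Optional["Segment"] = None      # None = solid wall

        @dataclass
        class Segment:
            name: str
            sides: List[Side] = field(default_factory=list)

        def render(seg: Segment, window: Rect, visited: set) -> None:
            if seg.name in visited:                   # levels can loop back on themselves
                return
            visited.add(seg.name)
            print("draw", seg.name)                   # stand-in for rasterizing the cube's faces
            for side in seg.sides:
                if side.neighbor is None:
                    continue                          # wall, not a portal
                narrowed = window.clip(side.screen_rect)
                if narrowed.empty():
                    continue                          # portal off-screen: prune the branch
                render(side.neighbor, narrowed, visited)

    With thousands of segments, only the handful reachable through on-screen portals is ever visited, which matches how the huge later levels stay fast.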
