Search Results

  • Is there a utility to visualise / isolate and watch application calls

    - by MyStream
    Note: I'm not sure what to search for, so guidance on that may be just as valuable as an answer. I'm looking for a way to visually compare the activity of two applications (in this case a webserver with PHP communicating with the system, or MySQL, or network devices, etc.) such that I can compare performance at a glance. I know there are tools to generate data dumps from benchmarks for Apache, and some available for PHP for tracing, that you can dump and analyse, but what I'm looking for is something that can report performance metrics visually from data on calls (what called what, how long it took, how much memory it consumed, and how that can be represented visually in a call stack) and present them graphically, as if it were a topology or layered visual with different elements of system calls occupying different layers. A typical visual may consist of the following (using swim diagrams as just one analogy):

        Network   (details here relevant to network diagnostics)
          |                ^ back out
          v                |
        Linux     (details here related to firewall/routing diagnostics)
          |                ^ back to network
          v                |
        Apache    (details here related to the web request)
          |                ^ response to apache
          v                |
        PHP ------------> other accesses to PHP files/resources
          |                ^
          v                |
        MySQL     (total time; each call listed + time + tables hit/records returned)

    My aim would be to be able to 'inspect' a request, or a range of requests over a period of time, to see what constituted the activity at that point in time and trace it from beginning to end as a diagnostic tool. Is there any such work in this direction? I realise it would be intensive on the server, but the intention is to benchmark and analyse processes against each other for both educational and professional reasons, and a visual aid is a great eye-opener compared to raw statistics or dozens of discrete activity-vs-time graphs. It's hard to show the full cycle. Any pointers welcome. Thanks!

    FROM COMMENTS:

    > Use XHProf in conjunction with other programs such as Percona Toolkit
    > (percona.com/doc/percona-toolkit/2.0/pt-pmp.html) for MySQL. Run Apache
    > with httpd -X & (single-threaded debug mode, backgrounded), then attach
    > with strace -> kcachegrind.
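    A minimal sketch of the attach step from that comment, assuming a single Apache worker started with httpd -X (the binary may be apache2 on Debian-family systems); the strace flags shown are standard, but turning the raw trace into a kcachegrind-style view still needs a converter or manual analysis:

        # start apache as a single foreground process, backgrounded by the shell
        sudo httpd -X &
        APACHE_PID=$!

        # attach strace: -f follows forked children, -tt adds wall-clock
        # timestamps, -T records the time spent inside each syscall
        sudo strace -f -tt -T -p "$APACHE_PID" -o /tmp/apache-trace.log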

  • What would cause a query run from SSMS on the local box to run slower than from a remote box

    - by Racter
    When I run a simple query such as "Select Column1, Column2 from Table A" from within SSMS running on my production SQL Server, the results seem to take extremely long (45 min). If I run the same query from my dev system's SSMS connecting to the production SQL Server, the results return within a few seconds (<60 sec). One thing I have noticed is that if the system was just rebooted, performance is good for a bit. It is hard to pin down a time, as I have had it start running slow very quickly after a reboot, but at most it performed well for 20 min before acting up again. Also, just restarting the SQL service does not resolve the issue or provide a temporary performance boost. Specs for the server are:

        Windows Server 2003, Enterprise Edition, SP2
        4 x Intel Xeon 3.6 GHz, 6 GB system memory
        Active/Active cluster
        SQL Server 2005 SP2 (9.0.3239)

  • Web log files analyzer

    - by Peter Štibraný
    I already use Google Analytics on my page, but I'd like to get additional info from log files. I've looked at various packages during the last few days, but nothing has impressed me so far. Some requirements:

        - must work at the log-file level (I use Apache combined logs, but can configure Apache to produce other types of logs)
        - can generate static reports (Windows/Linux) or use a GUI (Windows only)
        - should make it easy to add custom user agents and rerun the analysis
        - if it can recognize installation of Eclipse plugins from the log, that would be a big plus
        - understands Google SERP position referers
        - should not require two days to set up (AWStats, I am looking at you)
        - should still be under active development (i.e. Analog isn't a good answer)
        - preferably free, or at least not very expensive :-)

    Any good analyzer programs out there?

  • Sending a file to uClinux

    - by Mike
    I have a board running uClinux, and I need to send a file to it, but I'm not sure how... I have an RS232 port and an Ethernet port with an IP address at my disposal. I can telnet to the board, but uClinux doesn't have a built-in FTP client. What sort of options do I have here for transferring files from my Windows 7 (or OpenSUSE) machine(s) to this board?

    EDIT: I just found I have a TFTP client on it:

        # tftp
        BusyBox v0.60.5 (2012.07.09-14:05+0000) multi-call binary
        Usage: tftp [OPTION]... HOST [PORT]

    But I can't find any good information on how to use TFTP. Looking around Google, I see it's good for loading binary images, so I assume anything could be sent, but I'm not sure.
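    A sketch of one way this could work, assuming the desktop runs a TFTP server and this BusyBox tftp build supports the usual -g/-r/-l flags (older builds vary, so check the board's tftp help output first); the package name, init-script name, and IP address below are assumptions:

        # on the OpenSUSE machine: install a TFTP daemon and stage the file
        sudo zypper install atftp
        sudo cp myfile /srv/tftpboot/
        sudo /etc/init.d/atftpd start    # init-script name is a guess; adjust to your setup

        # on the uClinux board: pull the file from the desktop's IP
        tftp -g -r myfile -l /tmp/myfile 192.168.1.10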

  • I want to make my Application Open Source, but I'm unsure...

    - by Joern
    Hi! I have written a website, which is my big idea. Now I'm planning a version for intranets. The big question in my head is: "shall I make it open source?" The next ones are:

        - What do I have to prepare?
        - What do I need for a good start?
        - What is the best way to find people with the same interests?
        - What assurance do I have that I won't find a copy of my idea on another page, stolen from my open source code?

    Please discuss with me, give some good arguments pro and contra. Yours, Joern.

  • what to use instead of laptop-mode?

    - by playcat
    Hello, I have Ubuntu 10.10 64-bit on an HP 6735s (Turion processor). It overheats, and I'm forced to use Turion Power Control in order to keep the core temperature at a reasonable level. One more measure I take is putting my processors into the conservative governor. That way, I'm perfectly happy with its performance, and heat is where it should be. However, after my latest upgrade something happened: the cores are back to ondemand by default, and I'm not sure TurionPowerControl is working any more (ps axu | grep urion shows no process). In addition, I read somewhere that laptop-mode uses HDD spindown to preserve data/energy, and that HDDs survive only a limited number of those spindowns, so laptop-mode usage can actually shorten the life of my HDD. I'm wondering if there is a good way to set my cores to automatically go into conservative mode? Also, what's a good way to see what voltage my cores use? On Windows I use CPU-Z. Thanks, and sorry for the long explanation.
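    A minimal sketch for the governor question, assuming the cpufrequtils package (it was in the Ubuntu 10.10 repositories); GOVERNOR in /etc/default/cpufrequtils is the knob its init script reads at boot:

        # install the userspace tools
        sudo apt-get install cpufrequtils

        # switch every core to the conservative governor right now
        for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
            sudo cpufreq-set -c "${cpu##*cpu}" -g conservative
        done

        # make conservative the default at every boot
        echo 'GOVERNOR="conservative"' | sudo tee /etc/default/cpufrequtils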

  • svnrepo + trac hosting

    - by Shikhar
    Does anyone know of a good and economical svn + trac hosting site? Specific requirements:

        1) Trac hooks should be in place, so that commit messages get reflected in Trac issues.
        2) It should have emailTotracScript or MailToTracPlugin installed, with which an issue can be reported via email.

    If it's located in Asia-Pacific that would be great, as the latency from the US is very high. I am already using sourcerepo.com and it's very good. The only shortcoming is that they don't have emailtotrac, and the latency is significant. Any other input would be helpful. TIA

  • Stop Windows Media Player from connecting to Internet/MS using hosts file or alternate method?

    - by Joe
    Is there a way to prevent Windows Media Player from connecting to the internet and MS using the hosts file or other methods?

    Edit (Nov 20 2009 at 19:16): I have both VLC and MPC and I do use them. However, I am currently using WMP to organize all my music, and I hate that WMP is always making outgoing connections. I just tried TCPView and can't believe how many connections WMP makes when you first launch it. I have even disabled everything in its options that relates to connecting to the internet. Could any of you recommend a good media player that's also good for organizing your music library like WMP, and doesn't connect to the internet? Preferably one that a WMP user would actually like as much as WMP. The reason I use WMP is that I like its interface, the way it's set up, and how it looks.

  • Migrate to SSD - NTFS mount point for Program Files

    - by Icode4food
    Here is my thought. I have a new computer that I just built and am considering migrating to an SSD. I have Windows all set up and my development environment configured, so I want to avoid having to re-install a bunch of stuff. My thought is to clone my OS (Win7) to the SSD and then mount an HDD partition at C:\Program Files (x86)\, with C being my SSD. This way, as far as the programs are concerned they still live on the C drive, but in reality they are physically located on the HDD. This seems to me like a good idea, but after searching around a bit and not finding anyone else with the same idea, I'm wondering why not. Maybe I am missing something that is obvious to everyone but me. Why is this a good or a bad idea?

  • Tab Auto-Completion in Mac OS X when using sftp in terminal

    - by AlanTuring
    I have been getting very frustrated lately since the readline functionality has been removed from Mac OS X and tab auto-completion doesn't work any more. So I was wondering if anyone knew a good alternative I could install so I can tab-complete file names when sftp'd in. I heard that with-readline is a good option for this. If so, how do I get an alias sftp = with-readline sftp to work? I would like to do the same with any other option that isn't with-readline, so I don't have to set anything up each time I start a session. I am using Mac OS X 10.8 (Mountain Lion) with Homebrew installed. Thanks in advance to anyone who can help me.
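    A minimal sketch, assuming with-readline (or rlwrap, which Homebrew packages and which serves the same wrapping purpose) is on the PATH; the alias just has to live in a file your login shell reads:

        # in ~/.bash_profile (or ~/.bashrc), so it applies to every new session:
        alias sftp='with-readline sftp'

        # or, using rlwrap from Homebrew instead:
        #   brew install rlwrap
        #   alias sftp='rlwrap sftp'

        # pick up the change in the current shell
        source ~/.bash_profile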

  • Video/films organizing software for mac

    - by tig
    I am looking for an application that will help me organize movies, clips, and other videos I have. I'd tell it a folder to watch, and then I could mark every video there as watched/unwatched, set a rating, add a description, and tag it. Plus the ability to open videos with my preferred player (for me: VLC). An option to pull all the info from IMDb or another source would be very good. yFlicks would be a good one, but it uses QuickTime, so it doesn't like any non-standard codec or container (for example MKV), and it doesn't work well on Snow Leopard. Any suggestions?

  • Registry Cleaner, useful or not

    - by garybo
    Hi, I'm constantly seeing ads about registry cleaning. Each time I see one of those ads, I remember reading an article a few years ago (I don't remember who wrote it, but it was posted on one of those geek chat pages) about it not being necessary to clean a registry; in fact, the article continued, sometimes running a registry cleaner causes more harm than good. I would like to hear your opinion about this, and if you think it is good to use one of these programs, could you recommend a few? Thanks in advance. garybo

  • redirecting output from telnet / nc to file in script fails when cron'd

    - by qhartman
    So, I have a device on my network which sits there listening on a port for a connection, and when a connection is made it dumps ASCII data out. I need to capture that data to a file. I wrote a dead simple shell script that does this:

        #!/bin/bash
        #Config Variables. Age is in Days.
        DATA_ROOT=/root/data
        FILENAME=data_`date +%F`.dat
        HOST=device
        COMPRESS_AGE=3

        #Sanity Checks
        if [ ! -e $DATA_ROOT ]
        then
            echo "The directory $DATA_ROOT seems to not exist. Please create it."
            exit 1
        fi

        if [ -e $DATA_ROOT/$FILENAME ]
        then
            echo "You seem to have extracted data already today. Aborting"
            exit 1
        fi

        #Get Data
        nc $HOST 2202 > $DATA_ROOT/$FILENAME

        #Compress old Data
        find $DATA_ROOT -type f -mtime +$COMPRESS_AGE -exec gzip {} \;

        exit 0

    It works great when I run it by hand, but when I run it from cron, it doesn't capture any of the output. If I replace nc with telnet I see the initial telnet headers about escape sequences and whatnot, but not the data. Ideas? I've tried forcing bash to act like an interactive shell with -i. I've tried redirecting both stderr and stdout. I know it's got to be some silly simple thing, but I'm utterly failing. This is driving me nuts...

    EDIT: I also just noticed that the nc processes from all my previous attempts at this have been sitting there sleeping, and when I killed them, cron sent me a bunch of nonsensical error messages. At least now I have something to dig into!
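    An untested guess at the culprit: under cron there is no terminal, and nc can end up blocked waiting on stdin, which would match the sleeping processes noted in the edit. Redirecting stdin from /dev/null (plus an idle timeout via -w, on nc builds that support it) is a common workaround:

        # close stdin explicitly and give nc 30 seconds of idle timeout
        nc -w 30 $HOST 2202 < /dev/null > $DATA_ROOT/$FILENAME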

  • Dovecot starting and running, but not listening on any port

    - by Dženis Macanovic
    Among other things, I'm in charge of a Debian GNU/Linux (Wheezy) DomU for the mail services of the company I work for. Yesterday one HDD that was used for this particular server died. After installing Debian again, Dovecot decided to no longer listen on any ports (checked with netstat -l). Other services (like Postfix and MySQL) work without problems. Output of dovecot -n:

        # 2.1.7: /etc/dovecot/dovecot.conf
        # OS: Linux 3.2.0-3-amd64 x86_64 Debian wheezy/sid ext3
        auth_mechanisms = plain login
        disable_plaintext_auth = no
        first_valid_uid = 150
        last_valid_uid = 150
        mail_gid = mail
        mail_location = maildir:/var/vmail/%d/%n
        mail_uid = vmail
        namespace inbox {
          inbox = yes
          location =
          prefix =
        }
        passdb {
          args = /etc/dovecot/dovecot-sql.conf.ext
          driver = sql
        }
        plugin {
          sieve = ~/.dovecot.sieve
          sieve_dir = ~/sieve
        }
        service auth {
          unix_listener /var/spool/postfix/private/auth {
            group = postfix
            mode = 0660
            user = postfix
          }
          unix_listener auth-userdb {
            group = mail
            mode = 0666
            user = vmail
          }
        }
        service imap-login {
          inet_listener imaps {
            port = 993
            ssl = yes
          }
        }
        service pop3-login {
          inet_listener pop3s {
            port = 995
            ssl = yes
          }
        }
        ssl_cert = </etc/ssl/private/mail.crt
        ssl_key = </etc/ssl/private/mail.key
        userdb {
          args = /etc/dovecot/dovecot-sql.conf.ext
          driver = sql
        }
        protocol imap {
          mail_max_userip_connections = 25
        }

    UID 150 is vmail (I double-checked file permissions). I didn't install Dovecot from source, but via apt from the official Debian US mirror. There are no messages concerning Dovecot in /var/log/syslog except for:

        Oct 21 06:36:29 server dovecot: master: Dovecot v2.1.7 starting up (core dumps disabled)

    Any ideas?

  • After RAID failure SBS 2008 issues logging in and Exchange store does not mount

    - by Josh R
    Today has been one of those days. Yesterday a hard drive in our Dell PowerEdge 2900 server failed, and the RAID array didn't degrade gracefully, so I called Dell (the server is still under warranty) and got an engineer to work through the RAID issues with me. He was a nice guy but didn't do too much. We tried to put the RAID in a state where it was bootable, and even though we only lost one disk there are still issues with the server. Once we got the server to boot, there was an error message saying that logonui.exe was corrupted and we needed to run chkdsk. I clicked through the error messages and the login screen never came up. So I power-cycled the server and it ran chkdsk automatically, but the login screen still didn't appear. I tried safe mode; no difference there either. So the issues I am currently having are:

        1) The server boots up and the loading-Windows screen comes up, then it dumps me into a black screen where I can only see my mouse cursor. Ctrl+Esc doesn't work; Ctrl+Alt+Del doesn't work.
        2) Some of the services come up: DHCP, DNS, DFS, and Print.
        3) The Exchange information store and transport services don't start. I tried using MMC to connect to services.msc on the computer and start them, but they throw an error message of "Can't start because group or dependency failed".

    Has anyone had a problem like this? Can anyone offer some guidance? Thanks a bunch!

  • Printer recommendation

    - by Coding District
    Hi guys, I'm looking to buy a printer for home use and I'm not sure which one to get. I'm not very good when it comes to printers. Here's what I'm looking for:

        - cheap (lowest $ per page)
        - good build quality (lasts longest; any specific brands to avoid?)
        - not heavy printing (let's say ~5 pages per week)
        - OK output quality (I don't need "the best"; I'm not going to print any photos, but I will need color)
        - can scan, fax, and print

    I'm currently looking at these two, since it's Boxing Day tomorrow and they're on sale:

        http://www.bestbuy.ca/EN-CA/product/id/10155178.aspx
        http://www.bestbuy.ca/en-CA/product/hewlett-packard-hp-officejet-wireless-all-in-one-inkjet-printer-4500-wl-4500-wl/10146663.aspx?path=14c256643988a02e34424eec10028145en02

    Can I get some opinions about the above?

  • Move /var directories to /mnt on an EC2 instance

    - by Geoff Lanotte
    I am trying to work out a standard configuration for a set of EC2 instances running Ubuntu 12.04. These servers are going to be primarily web servers for a Ruby on Rails application. When you configure a new large instance, you are given an 8 GB primary volume and 400 GB of ephemeral storage mounted at /mnt. It seems logical to me to move some directories with growth potential off to /mnt; I was specifically thinking of /var/www and /var/log. My question is two-fold:

        1. Is this a good idea, or are there pitfalls that I cannot see?
        2. If this is a good idea, how should I go about configuring it?

    I do have the ability to configure new instances and bring down our old ones. My concern is doing this in such a way that, long term, it prevents downtime. I am a developer with some experience in devops, but mounting drives is something I have not faced before, so explicit directions would be greatly appreciated.
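    One possible shape for part 2, as a sketch rather than a tested recipe: copy the data across and bind-mount the /mnt copies over the original paths. The commands below assume the ephemeral volume is already mounted at /mnt and Apache is the web server. The big pitfall for part 1 is that ephemeral storage is wiped whenever the instance is stopped, so anything moved there must be re-creatable at boot or expendable:

        # stop the writers first so nothing lands mid-copy
        sudo service apache2 stop

        sudo mkdir -p /mnt/var/www /mnt/var/log
        sudo rsync -a /var/www/ /mnt/var/www/
        sudo rsync -a /var/log/ /mnt/var/log/

        # overlay the originals with the /mnt copies
        sudo mount --bind /mnt/var/www /var/www
        sudo mount --bind /mnt/var/log /var/log

        # /etc/fstab entries so the binds come back on reboot:
        #   /mnt/var/www  /var/www  none  bind  0  0
        #   /mnt/var/log  /var/log  none  bind  0  0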

  • Any recommendations on a NAS for a home-super-user?

    - by marc_s
    Can anyone recommend a good NAS for use in a home-server environment? I would request at least 2, preferably 4 disks, and I am most interested in good to excellent throughput for file-server and backup purposes - don't need any of the fancy media-streaming or -sharing features, that's not of interest to me. For a 4 or more disk solution, support for the various RAID levels (0, 1, 1+0, 5) would be a plus - especially if supported in hardware (rather than just a software emulation). I just need a place to put my collection of data, ISO images, and so forth - and since several external disks (self-built and off-the-shelf) have failed so far, I'm looking into a more reliable solution. Marc

  • Setting up logging for a remote backup script

    - by Brian Dainis
    So I wrote up a short script that I am planning to run via a cron job daily to package up my site files and send them to a remote location. I also plan to incorporate DB dumps, but I have not gotten that far yet. My issue today, however, is that I am uncertain how to log the output of each command for errors, warnings, or other pertinent information the command may produce. I would also like to add some type of fail-safe, so if something goes horribly wrong the script will stop dead in its tracks and notify me via email or something. OK, the email thing is not as critical, but it would be nice. Does anybody have any ideas for that? Here is what I have so far. By the way, both servers are CentOS 6.2 running standard LAMP.

        #!/bin/sh

        #################################
        ### Set Vars
        #################################
        THEDATE=`date +%m%d%y%H%M`

        #################################
        ### Create Archives
        #################################
        tar -cf /root/backups/files/server_BAK_${THEDATE}.tar -C / var/www/vhosts
        gzip /root/backups/files/server_BAK_${THEDATE}.tar

        #################################
        ### Send Data to Remote Server
        #################################
        scp /root/backups/files/server_BAK_${THEDATE}.tar.gz user@host:/home/bak1/ftp/backups/

        #################################
        ### Remove Data from this Server
        #################################
        rm -rf /root/backups/files/server_BAK_${THEDATE}.tar.gz
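    One hedged way to bolt logging and a fail-safe onto the script above: exec redirects the rest of the script's stdout/stderr into a log file, set -e aborts on the first failing command, and an EXIT trap turns that abort into an email. This assumes a working local MTA for the mail command, and you@example.com is a placeholder:

        #!/bin/sh
        THEDATE=`date +%m%d%y%H%M`
        LOGFILE=/root/backups/backup_${THEDATE}.log

        # from here on, everything the commands print goes into the log
        exec >> "$LOGFILE" 2>&1

        # stop dead on the first command that fails...
        set -e
        # ...and send a mail if we exit before reaching the end
        trap 'echo "backup failed, see $LOGFILE" | mail -s "backup FAILED" you@example.com' EXIT

        # ... tar / scp / rm steps from the original script go here ...

        # made it to the end: disarm the failure trap
        trap - EXIT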

  • Alternative to Canned Response in Gmail

    - by Stuck
    I have a mailbox that I share with a colleague. We want a good way to store templates for e-mails that we send often, like answers to common questions and so on. We have tried using Canned Responses to store these templates, but that GUI really sucks and is kind of unusable for anything other than signatures and the like. Is anyone aware of a good alternative? Requirements: we need to be able to share these templates, so they must be stored "in the cloud"; we want access to be as easy as possible, directly in Gmail (a Firefox plugin would be fine, since we both use Firefox); and we use both Mac and PC, so the solution must be cross-platform. Anyone have any ideas on how to solve this?

  • MS Access 2007 end user access

    - by LtDan
    I need some good advice. I have used Access for many years, and I use SharePoint, but never the two combined. My newly created Access db needs to be shared with many users across the organization. The back end is SQL, and the old way to distribute the database would be placing the db on a shared drive, connecting each PC's ODBC connection to the SQL db, and then having users open the database and have at it. This has become the OLD way. What is the best (and simplest) way to allow the end users to use a front end for data entry/editing, reporting, etc.? Can I create a link through SharePoint so the users just open it from there? Your good advice is greatly appreciated.

  • Best practice for administering a (hadoop) cluster

    - by Alex
    Dear all, I've recently been playing with Hadoop. I have a six-node cluster up and running with HDFS, and I have run a number of MapReduce jobs. So far, so good. However, I'm now looking to do this more systematically and with a larger number of nodes. Our base system is Ubuntu, and the current setup has been administered using apt (to install the correct Java runtime) and ssh/scp (to propagate the various conf files). This is clearly not scalable over time. Does anyone have any experience with good systems for administering (possibly slightly heterogeneous: different disk sizes, different numbers of CPUs on each node) Hadoop clusters automagically? I would consider diskless boot, but imagine that with a large cluster, getting the cluster up and running might be bottlenecked on the machine serving the OS. Or some form of distributed Debian apt to keep the machines' native environments synchronised? And how do people successfully manage the conf files over a number of (potentially heterogeneous) machines? Thanks very much in advance, Alex
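    As a stopgap between hand-run scp and a full configuration-management system, the current approach can at least be made one command. This is only a sketch, assuming passwordless ssh to every node, a nodes.txt listing hostnames, and the conf directory living at the (hypothetical) path /etc/hadoop/conf:

        # push the master copy of the conf directory to every node in the list
        while read node; do
            rsync -a /etc/hadoop/conf/ "$node":/etc/hadoop/conf/
        done < nodes.txt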

  • Textmate: Find and replace across project with contents of one file from said project

    - by griotspeak
    I have a regular expression to find the text I want (I wrapped the relevant section in custom tags), and I can do the replacement by hand without much issue, but what I want is a way to automatically find and replace throughout the entire project. A macro seems like an OK idea, but it would be nicer to have a command (something I can edit and tweak). sed seems like a good bet, but I am pretty unfamiliar with it. I am not so much asking for a complete solution as I am asking for an example that does something close to what I want. I don't really know of a good way to start.
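    A starting-point sketch rather than a finished solution; the tag name, the file glob, and snippet.txt are all placeholders. Two real limitations to note: plain sed works line by line, so this only handles tags that open and close on the same line, and the substitution breaks if the snippet contains newlines or the characters # and &. On stock OS X, sed -i needs an empty argument, as shown.

        # the replacement text, read from one file in the project
        REPLACEMENT=$(cat snippet.txt)

        # swap everything between <mytag>...</mytag> for it, across the project
        find . -name '*.html' -exec sed -i '' "s#<mytag>.*</mytag>#${REPLACEMENT}#g" {} +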
