Search Results

Search found 21539 results on 862 pages for 'hanging process'.


  • How to view special characters in SQL Management Studio

    - by B Z
    SQL Server 2005. I have a text column that contains special characters (e.g. CR, LF), but I don't know exactly which ones. I would like to view these characters in Management Studio, something like Notepad++'s Show Symbol > Show All Characters. My goal: I am working on a data conversion from one database to another. When the data is converted and viewed in the native application, it displays some funky characters, like a pipe character. I would like to eliminate these characters during the conversion process.
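
    A minimal T-SQL sketch of two common ways to make the hidden characters visible (the table and column names are hypothetical):

        -- hex dump: CR appears as 0D, LF as 0A
        SELECT CAST(CAST(col AS VARCHAR(MAX)) AS VARBINARY(MAX))
        FROM dbo.MyTable;

        -- or tag the control characters inline so they show in the results grid
        SELECT REPLACE(REPLACE(CAST(col AS VARCHAR(MAX)), CHAR(13), '{CR}'),
                       CHAR(10), '{LF}')
        FROM dbo.MyTable;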

    Read the article

  • Advice on selecting programming languages to concentrate on? (2nd year IT security student)

    - by Tyler J Fisher
    I'm in the process of deciding which programming languages I should devote the majority of my coding studies to. I'm a 2nd year CS student, majoring in IT security.

    What I want to do/work with:
    - Intelligence gathering
    - Relational databases
    - Virus design
    - Snort network IPS

    Current coding experience (what I'm going to keep):
    - Java - intermediate
    - HTML5 - intermediate
    - SQL (MySQL, Oracle 11g) - basic
    - BASH - basic

    I'm going to need to learn at least one of the following languages in order to be successful in my field:
    - Ruby (+Metasploit)
    - C++ (virus design, low-level driver interaction, computationally intensive applications)
    - Python (import ALL the things)

    My dilemma: if I diversify too broadly, I won't be able to focus on, and improve in, a specific niche. Does anyone have any advice as to how I should select a language? What I'm considering, and why: I'm leaning towards Ruby because of its Metasploit support, despite its lower efficiency compared to Python. Any suggestions based on real-world experience? Should I focus on Ruby, Python, or C++? Both Ruby and Python are regarded as syntactically similar to Java, which my degree is based around. I'm going to be studying C++ in two years as a component of my malicious code class. Thanks, Tyler

    Read the article

  • multicast tcpdump and subscriptions

    - by Karoly Horvath
    From the multicast howto: "IP_ADD_MEMBERSHIP. Recall that you need to tell the kernel which multicast groups you are interested in. If no process is interested in a group, packets destined to it that arrive to the host are discarded." If you don't do that, you won't see those packets with tcpdump. Is it possible to subscribe to all multicast traffic so I can do a tcpdump of all existing traffic? I would think IGMP doesn't allow this, so probably not, but maybe you can configure a switch to still send all multicast traffic. Is that possible? Is it possible to do the subscription (for a specific IP) with a command-line tool? (Note: I know how to do this in C, but I would prefer to use an existing tool and not compile a separate program for this.)
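
    On the last question, a small Python sketch can stand in for a compiled tool (the group address is hypothetical): it performs the IGMP join so the kernel stops discarding that group's packets, and while it runs, tcpdump on the same host will see the traffic:

        import socket
        import struct

        GROUP = '239.1.2.3'  # group to subscribe to (illustrative)

        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # ip_mreq is the group address plus the local interface (INADDR_ANY here)
        mreq = struct.pack('4s4s', socket.inet_aton(GROUP),
                           socket.inet_aton('0.0.0.0'))
        s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        input('joined %s - press Enter to leave' % GROUP)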

    Read the article

  • Enjoy Seamless Reading at Twitter in Chrome

    - by Asian Angel
    Twitter can be a lot of fun, but having to constantly use the More button to view a large number of tweets is frustrating. All you need to be rid of that frustration is the More Tweets! extension for Google Chrome. Before: here it is…the classic "More" button. If you are only interested in viewing a few tweets on occasion, it is not a problem. But if you are looking at a large number of tweets on a daily basis, it can be very frustrating. Notice the last tweet from TinyHacker shown here… After: after installing the extension, the only thing you will need to do is refresh your Twitter page if you had it open beforehand. Now there will be a seamless connection from page to page when you are reading through tweets. You can see the TinyHacker tweet from above followed oh so nicely by tweets from the second page…this is definitely an improvement. For those who may be curious: if you are quick enough with your mouse, you can see what the "automated connection process" looks like. Conclusion: if you are tired of constantly clicking the "More" button and just want to read tweets without interruption, you will be very satisfied after adding this extension to your browser. Links: Download the More Tweets! extension (Google Chrome Extensions)

    Read the article

  • SQL Cruise Alaska 2011

    - by Grant Fritchey
    I had the extreme good fortune to get sent on the last SQL Cruise to Alaska. I love my job. In case you don't know what this is, SQL Cruise is a trip on a cruise ship during which you get to attend classes while on the boat, learning all about SQL Server and related topics, as well as network with the instructors and the other Cruisers. Frankly, it's amazing. Classes ran from Monday, 5/30, to Saturday, 6/4. The networking was constant: between classes, at night on the cruise ship, out on excursions in Alaskan rainforests and while snorkeling in ocean waters. Here's a rundown of the experience from my point of view.

    Because I couldn't travel out 2 days early, I missed the BBQ that occurred the day before the cruise, when many of the Cruisers received their swag bags. Some of that swag came from Red Gate. I researched what was useful on a cruise like this and purchased small flashlights and binoculars for all the Cruisers. The flashlights were because, depending on your cabin, ships can be very dark. The binoculars were so that the Cruisers could watch all the beautiful landscape as it flowed by. I would have liked to have been there when the bags were opened, but I heard from several people that they appreciated the gifts.

    Cruisers "in" the hot tub. Pictured: Marjory Woody, Michele Grondin, Kyle Brandt, Grant Fritchey, John Halunen

    Sunday I went to board the ship with my wife. We had a bit of an adventure because I messed up our documents. It all worked out and we got on board to meet up at the back of the boat at one of the outdoor bars with the other Cruisers, thanks to tweets letting everyone know where to go. That was the end of electronic coordination on the trip (connectivity in Alaska was horrible for everyone except AT&T). The Cruisers were a great bunch of people and it was a real honor to meet them and get to spend time with them. After everyone settled into their cabins, our very first activity was a contest, sponsored by Red Gate. The Cruisers, in an effort to get to know each other and the ship, were required to go all over taking various photographs, some of them hilarious. The winning team of three would all win prizes. Some of the significant others helped out, and I tagged along with a team that tied for first but lost the coin toss. The winning team consisted of Christina Leo (blog|twitter), Ryan Malcolm (twitter) and Neil Hambly (blog|twitter). They then had to do math and identify the cabin with the lowest prime number - oh, and get a picture of it and be the first to get back up to the bar where we were waiting. Christina came in first and very happily carried home an iPad 2. Ryan won a 1TB portable hard drive and Neil won a wireless mouse (picture below; note my special SQL Server Central Friday shirt - thanks Steve (blog|twitter)).

    Winners: Christina Leo, Neil Hambly, Ryan Malcolm. Just Lucky: Grant Fritchey

    Monday morning classes started. Buck Woody (blog|twitter) was a special guest speaker on this cruise. His theme was "Three C's on the High Seas: Career, Communication and Cloud." The first session was all on Career. I'm not going to type out all my notes from the session, but let's just say, if you get the chance to hear Buck talk about how to manage your career, I suggest you attend. I have a ton of blog posts that I'll be putting together over the next several months (yes, months), both here and over on ScaryDBA. I also have a bunch of work I'm going to be doing to get my career performance bumped up a notch or two (and let's face it, that won't be easy).
    Later on Monday, Tim Ford (blog|twitter) did a session on DMOs. Specifically, the session was on Tim's Periodic Table of DMOs that he has put together, and how to use some of the more interesting DMOs in your day-to-day job. It was a great session, packed with good information. Next, Brent Ozar (blog|twitter) did a session on how to monitor and guide SAN configuration for the DBA who doesn't have access to the SAN. That was some seriously useful information.

    Tuesday morning we only had a single class. Kendra Little (blog|twitter) taught us all about "No Lock for Yes Fun". It was all about the different transaction isolation levels and how they work. There is so often confusion in this area, and Kendra does a great job of clarifying the information. Also, she tosses in her excellent drawings to liven up the presentation. Then it was excursion time in Juneau. My wife and I, along with several other Cruisers, took a hike up around the Mendenhall Glacier. It was absolutely beautiful weather, and walking through the Alaskan rain forest was a treat. Our guide, Jason, was a great guy and it was a good day of hiking.

    Wednesday was an all-day excursion in Skagway. My wife and I took the "Ghost and Good Time Girls" walking tour, which ended up at a bar that used to be a brothel, the Red Onion. It was a great history of the town. We went back out and hit a few museums and exhibits. We also hiked up the side of the mountain to see Dewey Lake and some great views of the town. Finally we hiked out to the far side of town to see the Gold Rush cemetery. Hiking done, we went back to the boat and had a quiet dinner on our own.

    Thursday we cruised through Glacier Bay and saw at least four different glaciers, including sitting next to the Margerie Glacier for about an hour. It was amazing. Then it got better. We went into class with Buck again, this time to talk about Communication. Again, I've got pages of notes that I'm going to be referring back to for some time to come. This was an excellent opportunity to learn.

    Snorkelers: Nicole Bertrand, Aaron Bertrand, Grant Fritchey, Neil Hambly, Christina Leo, John Robel, Yanni Robel, Tim Ford

    Friday we pulled into Ketchikan. A bunch of us went snorkeling. Yes, snorkeling. Yes, in Alaska. Yes, snorkeling in the ocean in Alaska. It was fantastic. They had us put on 7mm-thick wet suits (an adventure all by itself), so it was basically warm the entire time we were in the water (except for the occasional squirt of cold water down my back). Before we got in the water, a bald eagle flew up and landed about 15 feet in front of us, which was just an incredible event. Then our guide pointed out about 14 other eagles in the area, hanging out in the trees. Wow! The water was pretty clear and there was a ton of things to see. That was absolutely a blast. Back on the boat I presented a session called Execution Plans: The Deep Dive (note the nautical theme). It seemed to go over well, and I had several good questions come out of the session that will lead to new blog posts. After I presented, it was Aaron Bertrand's (blog|twitter) turn. He did a session on "What's New in Denali" that provided a lot of great information. He was able to incorporate new things straight out of Tech-Ed, so this was expanded beyond his usual presentation. The man really knows what he's talking about and communicates it well.

    Saturday we were travelling, so there was time for a bunch of classes.
    Jeremiah Peschka (blog|twitter) did a great overview of some of the NoSQL databases and what they should be used for. The session was called "The Database is Dead", but it was really about how there are specific uses for these databases that SQL Server doesn't fill, and also about how these databases can't replace SQL Server in other areas. Again, good material. Brent Ozar presented again with a session on Defensive Indexing. It was an overview of how indexes work and a deep dive into how to apply them appropriately in your databases to better support access. A good session, as you would expect. Then we pulled into Victoria, BC, in Canada and had a nice dinner with several of the Cruisers, including Denny Cherry (blog|twitter). After that it was back to Seattle on Sunday. By the way, the Science Fiction Museum in Seattle isn't a Science Fiction Museum any more. I was very disappointed to discover this.

    Overall, it was a great experience. I'm extremely appreciative of Red Gate for sending me, and of Tim, Brent, Kendra and Jeremiah for having me. The other Cruisers were all amazing people, and it was an honor and privilege to meet them and spend time with them. While this was a seriously fun time, it was also a very serious training opportunity, with solid information coming from seasoned industry pros.

    Read the article

  • Microsoft's new technical computing initiative

    - by Randy Walker
    I made a mental note of this earlier in the year: Microsoft literally buys computers by the truckload. From what I understand, it's a typical practice amongst large software vendors. You plug a few wires in, you test it, and you instantly have mega tera tera flops (don't hold me to that number). Microsoft has been plugging away at their cloud services (named Azure), which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly. With this in mind, it doesn't surprise me that I was recently sent an executive email concerning Microsoft's new technical computing initiative. I find it to be a great marketing idea with actual substance behind the real work. From a programmer's academic perspective, in college we dreamed about this type of processing power. It has decades of computer science theory behind it.

    A copy of the email received (note that I almost deleted this email, thinking it was spam due to its length):

    We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can, however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly.

    Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland. To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But, they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And, to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges.

    We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves.
    The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries. Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities. New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run, will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems.

    Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980s, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group.

    Enabling more people to make better predictions: We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers:

    Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day.

    Aerospace engineering firm a.i. solutions, Inc. needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights.
    To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft - from optimal launch times and orbit determination to attitude control and navigation - up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars.

    Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections.

    Our Technical Computing direction: Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas:

    Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed - reliably, consistently and quickly.

    Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and troubleshoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud.

    Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology.

    Thinking bigger: There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible:

    - Better predictions to help improve the understanding of pandemics, contagion and global health trends.
    - Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates.
    - More accurate prediction of natural disasters and their impact, to develop more effective emergency response plans.

    With an ambitious charter in hand, this new team is ready to build on our progress to date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better.

    Bob

    Read the article

  • Webcast - Oracle Database In-Memory Option

    - by Thanos Terentes Printzios
    Following the recent announcement by Larry Ellison on the Future of the Database, we are happy to share this exclusive series of live webcasts from Oracle Database Product Management, where you can learn more about the brand new Oracle Database 12c In-Memory option. Oracle Database In-Memory is Oracle's new memory-optimized technology that transparently accelerates analytic, data warehousing, and reporting workloads, while also accelerating transaction processing (OLTP) workloads. Participants will learn about Oracle Database In-Memory benefits, features, and leading-edge architecture. The Database In-Memory architecture provides the ability to easily process data orders of magnitude faster by simply enabling the feature and identifying tables to bring in-memory, without application changes. Details on Oracle Database In-Memory's ease of use and management, scalability, and availability will also be covered. Please join us to learn more about Oracle Database In-Memory and get first-hand knowledge of this important new feature.

    Delivery format: this FREE online LIVE eSeminar will be delivered over the Web. These Oracle webcasts are FREE for customers, system integrators, ISVs, VARs and platform partners.

    Presenter: Richard Jacobs, Oracle Solution Architect

    Europe Webcast 1: August 29, 2014, 10:00 am to 11:00 am Central European Summer Time (CEST). Register Here!
    Europe Webcast 2: September 29, 2014, 10:00 am to 11:00 am Central European Summer Time (CEST). Register Here!

    Read the article

  • How do *you* track and document routine maintenance?

    - by Zak
    What software or system do you guys out on Server Fault use to remind you to do routine maintenance? How do you checklist and log the various items you are supposed to check? Do you have an internal process document? Do you have cron mail you every week with reminders to check system logs? Also, do you work on a team to do system maintenance, and if so, how do you coordinate who will do what maintenance? If you use a bug/issue tracking system to enter tasks, do you have a cron job enter recurring tasks?
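
    For the cron-reminder angle specifically, a minimal sketch of a crontab entry (the schedule, checklist and address are all made up) that mails a weekly nudge every Monday at 08:00:

        # min hour dom mon dow  command
        0 8 * * 1 echo 'Weekly maintenance: review system logs, disk usage, backup status' | mail -s 'Maintenance reminder' admin@example.com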

    Read the article

  • Windows 7 - Non Admin run as Admin for Explorer - still can't see all tmp internet files

    - by Steve
    I'm trying to retrieve video files from the IE 8 cache for a user who is not an admin in Win 7. As a non-admin user, I run Explorer as admin and still can't see the temporary internet files for the non-admin user. Only if I log in as a user who is an admin can I see the files. Is there any way I can see the files without having to go through the login process? Essentially, I want the video file from this page and others like it: http://video.yahoo.com/watch/111585/1027823

    Read the article

  • Handling range in CNAME

    - by Imran
    We have different sets of CNAMEs pointing to different subdomains. These subdomains (a.domain.com, b.domain.com) point to different IPs on different machines.

        # Server A
        a1.domain.com pointing to a.domain.com
        a2.domain.com pointing to a.domain.com
        ..
        aN.domain.com pointing to a.domain.com

        # Server B
        b1.domain.com pointing to b.domain.com
        b2.domain.com pointing to b.domain.com
        ..
        bN.domain.com pointing to b.domain.com

    Currently, we have to add individual CNAME entries (e.g. a1 ... aN) against a single subdomain (a.domain.com). We repeat the above process for every new server, which is actually another subdomain (e.g. c.domain.com). Is there a way we can specify a range of CNAMEs (e.g. [a1..a25].domain.com pointing to a.domain.com) instead of adding separate CNAME entries? Is there any possibility of handling this at the DNS or web server (Apache or Nginx) level?
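
    If the zone is served by BIND, its $GENERATE directive covers exactly this case; a sketch of the zone-file entries (names and range are illustrative):

        ; expands to a1.domain.com ... a25.domain.com, all CNAMEs to a.domain.com
        $GENERATE 1-25 a$ CNAME a.domain.com.
        $GENERATE 1-25 b$ CNAME b.domain.com.

    Outside BIND, a wildcard record (*.domain.com) or generating the entries with a small script are the usual fallbacks.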

    Read the article

  • grep pattern interpreted differently on 2 different systems with the same grep version

    - by Lance Woodson
    We manufacture a Linux appliance for data centers, and all are running Fedora installed from the same kickstart process. There are different hardware versions, some with IDE hard drives and some SCSI, so the filesystems may be at /dev/sdaN or /dev/hdaN. We have a web interface into these appliances that shows disk usage, which is generated using "df | grep /dev/*da". This generally works for both hardware versions, giving output like the following:

        /dev/sda2   5952284  3507816  2137228  63% /
        /dev/sda5  67670876  9128796 55049152  15% /data
        /dev/sda1    101086    11976    83891  13% /boot

    However, for one machine, we get the following result from that command:

        Binary file /dev/sda matches

    It seems that it's grepping files matching /dev/*da for an unknown pattern for some reason, only on this box, which is seemingly identical in grep version, packages, kernel, and hardware. I switched the grep pattern to "/dev/.da" and everything works as expected on this troublesome box, but I hate not knowing why this is happening. Anyone have any ideas? Or perhaps some other tests to try?
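
    A likely explanation, offered as a hedged sketch: the glob /dev/*da is unquoted, so the shell expands it against the filesystem before grep ever runs. On a box where it matches more than one device node (say, both /dev/hda and /dev/sda), the first match becomes grep's pattern and the rest become files to search, producing exactly the "Binary file /dev/sda matches" output. Quoting prevents the expansion:

        df | grep /dev/*da        # unquoted: may expand to 'grep /dev/hda /dev/sda'
        df | grep '/dev/[sh]da'   # quoted: the pattern reaches grep intact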

    Read the article

  • Problem after mysql-server installation: I can't install anything in Ubuntu 12.04.1 now

    - by mohammed ezzi
    I'm not an advanced user of Linux. I wanted to work with a database, so I installed mysql-server. I think I did something wrong, because I got into trouble and now I can't install anything. This is what I get when I use apt-get -f install:

        root@me:~# apt-get -f install
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following extra packages will be installed:
          mysql-server mysql-server-5.5
        Suggested packages:
          tinyca mailx
        The following packages will be upgraded:
          mysql-server mysql-server-5.5
        2 upgraded, 0 newly installed, 0 to remove and 194 not upgraded.
        2 not fully installed or removed.
        Need to get 0 B/8,737 kB of archives.
        After this operation, 15.4 kB of additional disk space will be used.
        Do you want to continue [Y/n]? y
        dpkg: dependency problems prevent configuration of mysql-server-5.5:
         mysql-server-5.5 depends on mysql-server-core-5.5 (= 5.5.24-0ubuntu0.12.04.1); however:
          Version of mysql-server-core-5.5 on system is 5.5.28-0ubuntu0.12.04.3.
        dpkg: error processing mysql-server-5.5 (--configure):
         dependency problems - leaving unconfigured
        No apport report written because MaxReports is reached already
        dpkg: dependency problems prevent configuration of mysql-server:
         mysql-server depends on mysql-server-5.5; however:
          Package mysql-server-5.5 is not configured yet.
        dpkg: error processing mysql-server (--configure):
         dependency problems - leaving unconfigured
        No apport report written because MaxReports is reached already
        Errors were encountered while processing:
         mysql-server-5.5
         mysql-server
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    I tried to remove mysql-server, but nothing happened.
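
    One hedged way out, based on the version mismatch the error reports (the unpacked mysql-server-5.5 is 5.5.24 while mysql-server-core-5.5 is already 5.5.28): ask apt to pull mysql-server-5.5 up to the core package's exact version rather than removing anything. The pkg=version form is standard apt syntax; the version string is copied from the error output and may differ on another mirror:

        apt-get update
        apt-get install mysql-server-5.5=5.5.28-0ubuntu0.12.04.3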

    Read the article

  • Lingering TCP connection in LISTEN state

    - by Silvio Donnini
    My Java application can sometimes be killed by an external script, either with SIGTERM or with SIGKILL. The application is a server which receives many connections per second, and it can be killed while trying to serve them. I would like to restart the application whenever it's killed, so I have prepared a script for that purpose. The problem is that, once the app has been killed, the new application instance can't bind to the port used by the previous instance, because the "Address is already in use". The previous instance's process has definitely been terminated, yet the offending listening port is still there, assigned to bash (or sh on other machines). Obviously, my goal is to restart the application and let it bind successfully to the previous address. I've tried waiting more than 200 seconds before restarting, to no avail, and anyway I can't afford to wait that much. I've encountered this problem on all the machines I've run the application on (it is a Jetty server with Java 1.6). Any suggestion is appreciated. Thanks, Silvio
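
    Two hedged leads. First, if the port is merely stuck in TIME_WAIT, setting SO_REUSEADDR on the new listener lets it bind anyway; in Java that is ServerSocket.setReuseAddress(true) before bind. The Python sketch below shows the option itself (the port is illustrative). Second, a listening port that shows up owned by bash usually means a child process forked by the kill script inherited the listening descriptor and never closed it, which is worth checking separately.

        import socket

        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        # allow rebinding while the old endpoint lingers in TIME_WAIT
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(('0.0.0.0', 8080))
        s.listen(128)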

    Read the article

  • Linux to Linux, 10TB transfer?

    - by lostincode
    I've looked at all the previous similar questions, but the answers seemed to be all over the place and no one was moving a lot of data (100GB != 10TB). I've got about 10TB that I need to move from one RAID to another, over gigabit net, XFS file systems. My biggest concern is having the transfer die midway and not being able to resume easily. Speed would be nice, but ensuring the transfer completes is much more important. Normally I'd just tar & netcat, but the RAID I'm moving from has been super flaky as of late, and I need to be able to recover and resume if it drops mid-process. Should I be looking at rsync?
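
    A hedged sketch of the usual rsync answer (paths and host are placeholders); --partial keeps interrupted files, so rerunning the same command after a failure resumes rather than restarts:

        rsync -aH --partial --progress /mnt/old_raid/ user@dest:/mnt/new_raid/
        # rerun after any drop: completed files are skipped,
        # partially transferred ones are reused as the basis for the retry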

    Read the article

  • SSH keys fail for one user

    - by Eli
    I just set up a new Debian server. I disabled root SSH and password auth, so you've got to use a key file. For my primary user, everything works exactly as expected: I used ssh-keygen -t dsa and got myself a public and a private key, put one in authorized_keys, and put the other in a pem file locally. I wanted to create a user that I can deploy things with, so I did basically the same process: I created it with adduser, made a .ssh folder, ran ssh-keygen -t dsa (I also tried RSA), and put the keys in their appropriate locations. No luck. I'm getting a Permission denied (publickey) error. When I use the exact same keys as the account that works, same error. When I enable password authentication, I can log in via SSH with the password. How do I debug this?
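
    A hedged checklist for this symptom - sshd silently ignores authorized_keys when the permissions are too loose; the username ('deploy') is a stand-in:

        # on the server: keys are refused if home, ~/.ssh or the file are writable by others
        chmod 755 /home/deploy
        chmod 700 /home/deploy/.ssh
        chmod 600 /home/deploy/.ssh/authorized_keys
        chown -R deploy:deploy /home/deploy/.ssh
        # then watch the server side while connecting verbosely from the client
        tail -f /var/log/auth.log      # server
        ssh -v deploy@server           # client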

    Read the article

  • How to learn & introduce Scrum in a small startup?

    - by Jens Bannmann
    In a few months, a friend will establish his startup software company, and I will be the software architect, with one additional developer. Though we have no real day-to-day experience with agile methods, I have read much "overview" type material on them, and I firmly believe they are a good - if not the only - way to build software. So with this company, I want to go for iterative, agile development from day 1, preferably something lightweight. I was thinking of Scrum, but the question is: what is the best way for me and my colleagues to learn about it, to introduce it (which techniques, when, etc.) and to evaluate whether we should keep it? Background which might be relevant: we're all experienced developers of around the same age with a similar professional mindset. We have worked together in the past and afterwards at several different companies, mostly with a Java/.NET focus. Some are a bit familiar with general ideas from the agile movement. In this startup, I have great power over tools, methods and process. The startup's product will be developed from scratch and could be classified as middleware. We have some "customer" contacts in the industry who could provide input as soon as we get to an alpha stage.

    Read the article

  • How to cope with runaway Flash plugin in Google Chrome browser?

    - by Norman Ramsey
    I'm using Google Chrome for Linux, version 5.0.307.11 (Official Build 39572) beta, with the Linux Flash plugin version 10.0 r32. Quite often, the Flash plugin goes wild and pegs the CPU at about 95% usage. The laptop gets hot, and the battery drains. I can diagnose the problem with Chrome's little process monitor (Shift-Esc), and I can even kill the plugin, but then when I actually want to use Flash on a page, I can't find a way to restart the plugin; I have to exit and restart Chrome, which with 30 tabs open is a huge hit. Does anyone know what causes this problem? Does anyone have a better workaround (or, heaven forfend, a fix)? [I struck out both with search and with Google's help site for Chrome.]

    Read the article

  • Ubuntu server failing daily

    - by deanvz
    Symptoms:
    - Server becomes unresponsive - increase in load, all services stop
    - Loss of connectivity - ping/SSH
    - Must flush MySQL hosts after reboot, as MySQL refuses new connections
    - Intermittent Apache crashes
    - Generally happens in the early morning hours - 2 days of the week are, however, excluded

    Changes made:
    - Updated the OS to Ubuntu 10.04.4 LTS (not sure if the MySQL server was also updated in the process; current MySQL version - mysql Ver 14.14 Distrib 5.1.63, for debian-linux-gnu (x86_64) using readline 6.1)
    - Updated Plesk from 10.4.4 Update #47 to 11.0.9 Update #23
    - Rebooted on an almost daily basis
    - Stopped all crons for the times corresponding to the server crashes
    - Created a MySQL log to monitor the lock times on queries

    Possible causes:
    - Failing hardware
    - Incorrect software configuration (MySQL, Apache, etc.)

    Responsibilities:
    - Small webserver
    - Runs our billing system - WHMCS
    - Responsible for crons
    - Bulk-email solution - no delivery times coincide with server crashes

    Proposed solutions:
    - Move the machine over to a VM
    - Format and restore the Plesk server backup and take it from there?

    Side notes: there seems to be a general, intermittent Apache failure across all our Linux servers. Are we doing something fundamentally wrong in the Apache config? (I understand that this is a secondary question; just making sure it isn't possibly holding any relevance.)

    Read the article

  • Disk space consumed

    - by aravind-zoniac
    I have a very serious problem on one of my client's servers. The remote server runs Red Hat ES 5.2 with PostgreSQL as the database. I was trying to clone the database. The hard drive had 32 GB of free space before taking the clone. I started cloning the database, and during the process there was an internet issue; because of it, PuTTY got disconnected before the clone finished. So I opened another fresh session, and I was able to see only 2.5 GB of available space. I was also not able to see the clone in the psql terminal. Is there any way to recover the 29 GB that was consumed?
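
    Two hedged places to look (the data directory shown is the Red Hat default for PostgreSQL and may differ): space pinned by files that were deleted while a process still holds them open, and half-written clone files left behind by the dropped session:

        lsof +L1                                  # files deleted but still open, still consuming space
        du -x /var/lib/pgsql | sort -rn | head    # what actually grew on this filesystem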

    Read the article

  • HTTP, HTTPS and FTP are not working, but SMTP and IMAP are

    - by Unicron
    Hi all. Yesterday a strange thing happened on a friend's computer. After booting, the ports for HTTP, HTTPS and FTP are closed, but e-mail is still working. In the Control Panel, the Windows firewall seems active even if he tries to deactivate it. I have a suspicion that it is the fault of Norton Internet Security 2010. We have tried to uninstall it, but the uninstallation did not work. When using the removal tool from Symantec, it just goes to 23% and then crashes. The process ccSvcHst.exe is still running. How can I safely remove the rest of Norton Internet Security? Thanks in advance. [edit] Norton Internet Security 2010 has been successfully removed, but there is still no connectivity.
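
    One hedged avenue once the suite is gone: security products often hook into the Winsock stack, and their leftovers can break some TCP-based protocols while leaving others working. Resetting the Winsock catalog and TCP/IP stack is a commonly suggested step on Windows (run from an elevated command prompt, then reboot):

        netsh winsock reset
        netsh int ip reset reset.log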

    Read the article

  • A New Year’s Celebration in June

    - by Kristin Rose
    Happy Oracle New Year, everyone! Last week marked the official start of FY13, and we could not be more pleased with all that lies ahead this quarter, and all that we accomplished in the last…especially our newly updated Oracle PartnerNetwork (OPN) Solutions Catalog. If you thought it was great before, just wait until you see it now. We are ringing in our New Year right by fully equipping partners with the necessary tools they need to have another successful year. The Solutions Catalog will help draw attention to your partner services and offerings, highlighting your expertise, and it is centralized and easy for customers to navigate. Some of the exciting advancements include:

    - A streamlined search interface
    - A robust lead-capture tool that requests the contact information of potential customers
    - A professional display of customer recommendations to showcase your skill set
    - A partner dashboard with enhanced profile creation and an improved publication process

    Most exciting of all, updating your profile is easier than ever with the updated partner dashboard. Keeping your partner profile up to date will help ensure customers are looking at the correct information about your company and can easily stay on top of any new developments or Specializations you receive. So don't cut yourself short; be sure to update your profile today if you haven't already done so. For more information on the exciting upgrades available to you, visit the 'Resources for Partners' page, or watch Takane Aizeki, Principal Portal Manager at Oracle, walk through the upgraded Solutions Catalog and the different ways to showcase your value as an Oracle solution provider.

    Cheers,
    Lydia Smyers
    Group Vice President
    WWA&C and Communications

    Read the article

  • Weirdness After Reinstalling the Windows Operating System

    - by Eka Anggraini
    I want to ask for advice from you guys. I reinstalled my OS successfully, and I turned it off and restarted it a few times with no problem. A few hours later, when I turned it on, there was suddenly an error:

        Windows could not start because the following file is missing or corrupt:
        \WINDOWS\SYSTEM32\CONFIG\SYSTEM
        You can attempt to repair this file by starting Windows Setup using the original Setup CD-ROM.
        Select 'R' at the first screen to start repair.

    So I intended to repair and reinstall everything again. I inserted the Windows CD and pressed a key, and the text below appeared on a blue background:

        Setup is loading files (Windows Executive)
        Setup is loading files (Hardware Abstraction Layer)

    Then I waited half an hour with no change; repeating the process several times also achieved nothing. Please advise on where the problem lies: the hardware, or the Windows CD?
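
    If Setup ever does reach the Recovery Console, one commonly documented repair for this exact error (the procedure in Microsoft KB 307545) is restoring the registry hive from the repair folder; a sketch, assuming Windows lives in C:\WINDOWS:

        copy c:\windows\repair\system c:\windows\system32\config\system

    The hang while "Setup is loading files", though, points more toward failing hardware (RAM or the optical drive/CD), which is worth testing first.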

    Read the article

  • Planning for the Recovery

    - by john.orourke(at)oracle.com
    As we plan for 2011, there are many positive signs in the global economy, but also some lingering issues. Planning is no longer about extrapolating past performance and adjusting for growth. It is now about constantly testing the temperature of the water, formulating scenarios, assessing risk and assigning probabilities. So how does one plan for recovery and improve forecast accuracy in such a volatile environment? Here are some suggestions from a recent article I wrote, which was published in the December Financial Planning & Analysis (FP&A) newsletter from the AFP (Association for Financial Professionals):

    - Increase the frequency of forecasting
    - Get more line managers involved in the planning and forecasting process
    - Re-consider what's being measured, i.e. key financial and operational metrics
    - Incorporate risk and probability into forecasts
    - Reduce reliance on spreadsheets: leverage packaged EPM applications

    To learn more about these best practices, check out the FP&A section of the AFP website and register to receive the FP&A newsletter. AFP recently launched a new topic area focused on the FP&A function and items of interest to this group of finance professionals. In addition to the FP&A quarterly newsletter, AFP will be publishing articles, running webinars, and will have an FP&A track in their annual conference, which is in Boston next November. Brian Kalish, AFP's Finance Lead, is hoping this initiative creates a valuable networking and information-sharing resource for FP&A professionals. Here's a link to the FP&A page on the AFP web site: http://www.afponline.org/pub/res/topics/topics_fpa.html If you register on the site, you can access and subscribe to the FP&A newsletter and other resources. Best of luck in your planning for 2011 and beyond!

    Read the article

  • Application Lifecycle Management Tools

    - by John K. Hines
    Leading a team comprised of three former teams means that we have three of everything. Three places to gather requirements, three (actually eight or nine) places for customers to submit support requests, three places to plan and track work. We’ve been looking into tools that combine these features into a single product. Not just Agile planning tools, but those that allow us to look in a single place for requirements, work items, and reports. One of the interesting choices is Software Planner by Automated QA (the makers of Test Complete). It's a lovely tool with real end-to-end process support. We’re probably not going to use it for one reason – cost. I’m sure our company could get a discount, but it’s on a concurrent user license that isn’t cheap for a large number of users. Some initial guesswork had us paying over $6,000 for 3 concurrent users just to get started with the Enterprise version. Still, it’s intuitive, has great Agile capabilities, and has a reputation for excellent customer support. At the moment we’re digging deeper into Rational Team Concert by IBM. Reading the docs on this product makes me want to submit my resume to Big Blue. Not only does RTC integrate everything we need, but it’s free for up to 10 developers. It has beautiful support for all phases of Scrum. We’re going to bring the sales representative in for a demo. This marks one of the few times that we’re trying to resist the temptation to write our own tool. And I think this is the first time that something so complex may actually be capably provided by an external source. Hooray for less work!

    Read the article

  • Converting world space coordinate to screen space coordinate and getting incorrect range of values

    - by user1423893
    I'm attempting to convert from world-space coordinates to screen-space coordinates. I have the following code to transform my object position:

        Vector3 screenSpacePoint = Vector3.Transform(object.WorldPosition, camera.ViewProjectionMatrix);

    The value does not appear to be in screen-space coordinates and is not limited to a [-1, 1] range. What step have I missed out in the conversion process?

    EDIT: Projection matrix:

        Perspective(game.GraphicsDevice.Viewport.AspectRatio, nearClipPlaneZ, farClipPlaneZ);

        private void Perspective(float aspect_Ratio, float z_NearClipPlane, float z_FarClipPlane)
        {
            nearClipPlaneZ = z_NearClipPlane;
            farClipPlaneZ = z_FarClipPlane;
            float yZoom = 1f / (float)Math.Tan(fov * 0.5f);
            float xZoom = yZoom / aspect_Ratio;
            matrix_Projection.M11 = xZoom;
            matrix_Projection.M12 = 0f;
            matrix_Projection.M13 = 0f;
            matrix_Projection.M14 = 0f;
            matrix_Projection.M21 = 0f;
            matrix_Projection.M22 = yZoom;
            matrix_Projection.M23 = 0f;
            matrix_Projection.M24 = 0f;
            matrix_Projection.M31 = 0f;
            matrix_Projection.M32 = 0f;
            matrix_Projection.M33 = z_FarClipPlane / (nearClipPlaneZ - farClipPlaneZ);
            matrix_Projection.M34 = -1f;
            matrix_Projection.M41 = 0f;
            matrix_Projection.M42 = 0f;
            matrix_Projection.M43 = (nearClipPlaneZ * farClipPlaneZ) / (nearClipPlaneZ - farClipPlaneZ);
            matrix_Projection.M44 = 0f;
        }

    View matrix:

        // Make our view matrix
        Matrix.CreateFromQuaternion(ref orientation, out matrix_View);
        matrix_View.M41 = -Vector3.Dot(Right, position);
        matrix_View.M42 = -Vector3.Dot(Up, position);
        matrix_View.M43 = Vector3.Dot(Forward, position);
        matrix_View.M44 = 1f;

        // Create the combined view-projection matrix
        Matrix.Multiply(ref matrix_View, ref matrix_Projection, out matrix_ViewProj);

        // Update the bounding frustum
        boundingFrustum.SetMatrix(matrix_ViewProj);
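
    The missing step is most likely the perspective divide: Vector3.Transform applies the matrix but leaves the result in homogeneous clip space, so x, y and z still need dividing by w before they fall into the [-1, 1] NDC range. A sketch of both routes, using the names from the question (obj stands in for the question's object variable, renamed to avoid the C# keyword):

        // manual route: transform into clip space, then divide by w
        Vector4 clip = Vector4.Transform(obj.WorldPosition, camera.ViewProjectionMatrix);
        Vector3 ndc = new Vector3(clip.X, clip.Y, clip.Z) / clip.W;   // now in [-1, 1]

        // or let XNA return pixel coordinates directly
        Vector3 screen = game.GraphicsDevice.Viewport.Project(
            obj.WorldPosition, matrix_Projection, matrix_View, Matrix.Identity);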

    Read the article
