Search Results

Search found 6912 results on 277 pages for 'stay logged in'.


  • How does landscape calculate memory usage?

    - by David Planella
    I'm trying to debug an OOM situation on an Ubuntu 12.04 server. Looking at the memory graphs in Landscape, I noticed that there wasn't any serious memory usage spike. Then I looked at the output of the free command, and I wasn't quite sure how the two memory usage results relate to each other. Here's landscape-sysinfo's output on the server:

        $ landscape-sysinfo
          System load:    0.0                Processes:           93
          Usage of /:     5.6% of 19.48GB    Users logged in:     1
          Memory usage:   26%                IP address for eth0: -
          Swap usage:     2%

    Then I ran the free command and I got:

        $ free -m
                     total       used       free     shared    buffers     cached
        Mem:           486        381        105          0          4        165
        -/+ buffers/cache:        212        274
        Swap:          255          7        248

    I can understand the 2% swap usage, but where does the 26% memory usage come from?
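
    For comparison, here is a minimal sketch that computes the two common definitions of "used" memory straight from /proc/meminfo – the raw figure and the one that discounts buffers/cache. Which formula Landscape applies is an assumption to verify; the sketch just shows how far apart the two readings can be:

        # Compare "used" with and without buffers/cache, from /proc/meminfo.
        # A sketch only; landscape-sysinfo's exact formula may differ.
        awk '/^MemTotal:/ {t=$2} /^MemFree:/ {f=$2} /^Buffers:/ {b=$2} /^Cached:/ {c=$2}
             END {
               printf "used (raw):             %.0f%%\n", (t - f) * 100 / t
               printf "used (less buf/cache):  %.0f%%\n", (t - f - b - c) * 100 / t
             }' /proc/meminfo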

    Read the article

  • Mythbuntu initial setup cannot connect to server

    - by Hawke
    I'm really new to Linux, and I just installed Mythbuntu on a standalone PC. It all installed OK, I've logged on and started the setup, but I'm having issues. I select the language OK; the next screen is database setup. I select next, but it says it can't connect to the server and I just loop back. I've done some googling and checked the MySQL database password, and that is correct. I've also checked that my username belongs to the mythtv group, and it does. Can anyone help? I've tried reinstalling but it doesn't change. Many thanks.
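
    One way to narrow this down is to test the MySQL side separately from mythtv-setup. A hedged sketch – mythconverg is MythTV's default database name, and the config.xml location is an assumption that varies by release:

        # Test the credentials mythtv-setup would use (password is prompted;
        # take it from /etc/mythtv/config.xml on a typical Mythbuntu install).
        mysql -h localhost -u mythtv -p mythconverg -e 'SELECT 1;'
        # If that fails, check whether the database was created at all:
        sudo mysql -u root -p -e "SHOW DATABASES LIKE 'mythconverg';"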

    Read the article

  • Strange display language in GNOME Shell

    - by khalafuf
    I logged in to GNOME Shell and found that the display language had been set to some strange Asian language (I think) without my prompting. I tried to change the locale settings, but found that the default language is English (how?) despite that strange language. Here's a snapshot; see the strange word instead of "Activity". I'm on Ubuntu 12.04 LTS. Output of locale:

        LANG=zh_CN.UTF-8
        LANGUAGE=zh_CN:en_US:en
        LC_CTYPE="zh_CN.UTF-8"
        LC_NUMERIC=en_US.UTF-8
        LC_TIME=en_US.UTF-8
        LC_COLLATE="zh_CN.UTF-8"
        LC_MONETARY=en_US.UTF-8
        LC_MESSAGES="zh_CN.UTF-8"
        LC_PAPER=en_US.UTF-8
        LC_NAME=en_US.UTF-8
        LC_ADDRESS=en_US.UTF-8
        LC_TELEPHONE=en_US.UTF-8
        LC_MEASUREMENT=en_US.UTF-8
        LC_IDENTIFICATION=en_US.UTF-8
        LC_ALL=

    Output of locale -a:

        C  C.UTF-8  de_CH.utf8  en_AG  en_AG.utf8  en_AU.utf8  en_BW.utf8  en_CA.utf8
        en_DK.utf8  en_GB.utf8  en_IE.utf8  en_IN  en_IN.utf8  en_NG  en_NG.utf8  en_NZ.utf8
        en_PH.utf8  en_SG.utf8  en_US.utf8  en_ZA.utf8  en_ZM  en_ZM.utf8  en_ZW.utf8
        POSIX  zh_CN.utf8  zh_SG.utf8

    Solved: this answer did it.
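
    If the goal is simply to force the system default back to English, a minimal sketch (assumes sudo rights; log out and back in afterwards for it to take effect):

        # Reset the system-wide default locale to English.
        sudo update-locale LANG=en_US.UTF-8 LANGUAGE="en_US:en"
        # After logging back in, verify:
        locale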

    Read the article

  • Strategies for very fast delivery of webpages

    - by Cherian
    I run a website, Cucumbertown, with an initial payload of nearly 9 KB zipped. All my JS is delay-loaded with RequireJS; Modernizr is the only exception. All my webpages are cached by Nginx, and only 10-15% of hits go to the backend proxy; the cache is bypassed for logged-in users via proxy_cache_bypass. So for an anonymous user it's nearly always a cache hit. I have some basic OS tuning:

        ip route change default via <ip> dev eth0 initcwnd 15
        net.ipv4.tcp_slow_start_after_idle = 0

    Despite everything coming from cache and a large initcwnd, my pages still take 2.5-3 seconds. I have a YSlow score of [screenshot] and Page Speed of [screenshot]. Are there strategies that can help deliver webpages even faster than this? Deliver pages in about 1 second for a 10 KB payload? Notes: my servers run out of a fairly good data center, Linode's in Fremont.
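
    Before tuning further, it may help to see where the 2.5-3 seconds actually go; curl can time each phase of a request. A diagnostic sketch (the URL is a placeholder; run it from a client outside the data center):

        # Break one request down into DNS, connect, first-byte and total time.
        curl -o /dev/null -s -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n' \
            http://example.com/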

    Read the article

  • Tester that doesn't test

    - by George
    What should I do about a tester who does not test? We have a complicated dry-run scenario that takes a lot of time to execute. Mostly this tester executes his tests in a very slow way... checking emails, the internet, etc. He reports just a few bugs, but! Whenever the official dry run begins (these are logged with TestLink), the tester starts to open new bugs that were not discovered before. Is he not doing his job correctly? Or am I just overlooking how testing works? I'm not his supervisor, but he is testing code that I wrote.

    Read the article

  • How to Recover an Encrypted Home Directory on Ubuntu

    - by Chris Hoffman
    Access an encrypted home directory when you’re not logged in – say, from a live CD – and all you’ll see is a README file. You’ll need a terminal command to recover your encrypted files. You should also back up your mount passphrase ahead of time – you may need this in the future. While eCryptfs normally decrypts your files with your login passphrase, the mount passphrase may be necessary if eCryptfs’s files become lost.
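
    For reference, the usual route from a live CD is ecryptfs-utils' recovery helper; a sketch assuming the installed system's root is mounted at /mnt ("username" is a placeholder):

        # From a live session, with the target system mounted at /mnt:
        sudo apt-get install ecryptfs-utils
        sudo ecryptfs-recover-private /mnt/home/.ecryptfs/username/.Private
        # To back up the mount passphrase ahead of time (prompts for your
        # login passphrase), run this on the installed system:
        ecryptfs-unwrap-passphrase ~/.ecryptfs/wrapped-passphrase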

    Read the article

  • Should I use the same AddThis tag on multiple sites?

    - by ripper234
    I have an AddThis tag for one site:

        <script type="text/javascript" src="http://s7.addthis.com/js/250/addthis_widget.js#pubid=ripper234"></script>

    When I logged into AddThis and went to get my tag again, I saw it had changed:

        <script type="text/javascript" src="http://s7.addthis.com/js/300/addthis_widget.js#pubid=ripper234"></script>

    Should I use the same tag I got before, or the new tag? What's the difference? Is 250/300 an internal version number?

    Read the article

  • How to find data usage of a user on my website?

    - by Dharmik
    I have a website (project) where users log in, do their work, and then log out. I need to build a report that displays how much data each person has used (bandwidth, how much was downloaded in KB, etc.). So the process may be like counting usage from user login to user logout. I have seen a little about Webalizer and AWStats for something like this, but I am not sure how they work. I have tried Content-Length, but some pages don't send Content-Length. I have also seen mod_bandwidth, but I am still a little confused. This is needed for my site because our company is now thinking of charging per usage, and also of allocating bandwidth to each user (according to their membership). I haven't worked with these kinds of tools; I am a newbie in this matter. I have only done simple websites, never any setup like this in Apache or Linux. My project is in CodeIgniter.
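
    If the web server itself knows the user (e.g. HTTP authentication), a low-tech starting point is aggregating the access log. A sketch assuming Apache's common/combined log format, where field 3 is the authenticated user and field 10 the response size; with an application-level login (a CodeIgniter session), the application would have to log the user itself instead:

        # Sum response bytes per authenticated user from an Apache access log.
        # Field positions are an assumption; adjust to your LogFormat.
        awk '{ bytes[$3] += $10 }
             END { for (u in bytes) printf "%-20s %12.1f KB\n", u, bytes[u]/1024 }' \
            /var/log/apache2/access.log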

    Read the article

  • pgAgent startup script (under the postgres user)

    - by Dominique Guardiola
    Hello, I'm trying to make a clean startup script for pgAgent. I found one here, but I don't see how I can change this:

        if start-stop-daemon --start --quiet --pidfile /var/run/pgagent.pid \
            --exec /usr/bin/pgagent "hostaddr=127.0.0.1 dbname=postgres user=postgres \
            password=XXXXXXX"; then

    to launch something like this:

        su - postgres -c '/usr/bin/pgagent "hostaddr=127.0.0.1 dbname=postgres user=postgres"'

    in order to avoid hard-coding the PG password in the script. This is possible using the .pgpass file feature, and it works when I'm logged in as the postgres user. So my only remaining problem is how to launch this command as the postgres user. I tried to add --user=postgres to the call, as mentioned here, but it does not work.
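
    One option that avoids both su and the hard-coded password is letting start-stop-daemon drop privileges itself, so pgagent runs as postgres and libpq can pick up ~postgres/.pgpass. A sketch of the modified block, built on start-stop-daemon's --chuid option:

        # Run pgagent as the postgres user; the password then comes from
        # ~postgres/.pgpass instead of the script.
        if start-stop-daemon --start --quiet --pidfile /var/run/pgagent.pid \
            --chuid postgres \
            --exec /usr/bin/pgagent -- "hostaddr=127.0.0.1 dbname=postgres user=postgres"; then
            echo "pgagent started"
        fi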

    Read the article

  • Why are there so many guest accounts?

    - by Radu Radeanu
    After I saw this answer, I realized that there are many guest accounts on my system:

        $ grep guest /etc/passwd
        guest-jzXeRx:x:117:127:Guest,,,:/tmp/guest-jzXeRx:/bin/false
        guest-l5dAPU:x:118:128:Guest,,,:/tmp/guest-l5dAPU:/bin/false
        guest-FdSAkw:x:119:129:Guest,,,:/tmp/guest-FdSAkw:/bin/false
        guest-eBU0cU:x:121:131:Guest,,,:/tmp/guest-eBU0cU:/bin/false

    Moreover, at this moment nobody is logged in as guest, yet if somebody logs in as guest, a new guest account is created – why, when there are already other guest accounts? After the new guest logs out, that account is deleted. But why do the other guest accounts remain? For what use/purpose? The guest accounts don't bother me at all, but I want to know if it is okay to delete them manually.
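
    If you do decide to clear the leftovers manually, a cautious sketch (run it only while no guest session is active; deluser is Ubuntu's standard wrapper for removing accounts):

        # Remove stale guest-* accounts; their /tmp homes vanish on reboot anyway.
        for u in $(awk -F: '/^guest-/ {print $1}' /etc/passwd); do
            sudo deluser "$u"
        done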

    Read the article

  • The Business of Winning Innovation: An Exclusive Blog Series

    - by Kerrie Foy
    "The Business of Winning Innovation” is a series of articles authored by Oracle Agile PLM experts on what it takes to make innovation a successful and lucrative competitive advantage. Our customers have proven Agile PLM applications to be enormously flexible and comprehensive, so we’ve launched this article series to showcase some of the most fascinating, value-packed use cases. In this article by Keith Colonna, we kick-off the series by taking a look at the science side of innovation within the Consumer Products industry and how PLM can help companies innovate faster, cheaper, smarter. This article will review how innovation has become the lifeline for growth within consumer products companies and how certain companies are “winning” by creating a competitive advantage for themselves by taking a more enterprise-wide,systematic approach to “innovation”.   Managing the Science of Innovation within the Consumer Products Industry By: Keith Colonna, Value Chain Solution Manager, Oracle The consumer products (CP) industry is very mature and competitive. Most companies within this industry have saturated North America (NA) with their products thus maximizing their NA growth potential. Future growth is expected to come from either expansion outside of North America and/or by way of new ideas and products. Innovation plays an integral role in both of these strategies, whether you’re innovating business processes or the products themselves, and may cause several challenges for the typical CP company, Becoming more innovative is both an art and a science. Most CP companies are very good at the art of coming up with new innovative ideas, but many struggle with perfecting the science aspect that involves the best practice processes that help companies quickly turn ideas into sellable products and services. Symptoms and Causes of Business Pain Struggles associated with the science of innovation show up in a variety of ways, like: · Establishing and storing innovative product ideas and data · Funneling these ideas to the chosen few · Time to market cycle time and on-time launch rates · Success rates, or how often the best idea gets chosen · Imperfect decision making (i.e. the ability to kill projects that are not projected to be winners) · Achieving financial goals · Return on R&D investment · Communicating internally and externally as more outsource partners are added globally · Knowing your new product pipeline and project status These challenges (and others) can be consolidated into three root causes: A lack of visibility Poor data with limited access The inability to truly collaborate enterprise-wide throughout your extended value chain Choose the Right Remedy Product Lifecycle Management (PLM) solutions are uniquely designed to help companies solve these types challenges and their root causes. However, PLM solutions can vary widely in terms of configurability, functionality, time-to-value, etc. Business leaders should evaluate PLM solution in terms of their own business drivers and long-term vision to determine the right fit. Many of these solutions are point solutions that can help you cure only one or two business pains in the short term. Others have been designed to serve other industries with different needs. Then there are those solutions that demo well but are owned by companies that are either unable or unwilling to continuously improve their solution to stay abreast of the ever changing needs of the CP industry to grow through innovation. 
    What the Right PLM Solution Should Do for You

    Based on more than twenty years working in the CP industry, I recommend investing in a single solution that can help you solve all of the issues associated with the science of innovation in a totally integrated fashion. By integration I mean (1) the integration of all of the processes associated with the development, maintenance and delivery of your product data, and (2) the integration, or harmonization, of this product data with other downstream sources, like ERP, product catalogues and the GS1 Global Data Synchronization Network (or GDSN, which is now a CP industry requirement for doing business with most retailers). The right PLM solution should help you:

    Increase Revenue. A best-practice PLM solution should help a company grow its revenues by consolidating product development cycle time and helping companies get new and improved products to market sooner. PLM should also eliminate many of the root causes for a product being returned, refused and/or reclaimed (which takes away from top-line growth) by creating an enterprise-wide, collaborative, workflow-driven environment.

    Reduce Costs. A strong PLM solution should help shave many unnecessary costs that companies typically take for granted. Rationalizing SKUs, components (ingredients and packaging) and suppliers is a major opportunity at most companies that PLM should help address. A natural outcome of this rationalization is lower direct material spend and a reduction of inventory. Another cost-cutting opportunity comes with PLM when it helps companies avoid costs associated with process inefficiencies that lead to scrap, rework, excess and obsolete inventory, poor end-of-life administration, higher cost of quality and regulatory compliance, and increased expediting.

    Mitigate Risk. Risks are the hardest to quantify but can be the most costly to a company. Food safety, recalls, line shutdowns, customer dissatisfaction and, worst of all, the potential tarnishing of your brands are a few of the debilitating risks that CP companies deal with on a daily basis. These risks are so uniquely severe that they require an enterprise PLM solution specifically designed for the CP industry, one that safeguards product information and processes while still allowing the art of innovation to flourish.

    Many CP companies have already created a winning advantage by leveraging a single, best-practice PLM solution to establish an enterprise-wide, systematic approach to innovation.

    Oracle’s Answer for the Consumer Products Industry

    Oracle is dedicated to solving the growth and innovation challenges facing the CP industry. Oracle’s Agile Product Lifecycle Management for Process solution was originally developed with and for CP companies, and is driven by a specialized development staff solely focused on maintaining and continuously improving the solution per the latest industry requirements. Agile PLM for Process helps CP companies handle all of the processes associated with managing the science of innovation, including specification management, new product development/project and portfolio management, formulation optimization, supplier management, and quality and regulatory compliance, to name a few. And, as I mentioned earlier, integration is absolutely critical. Many Oracle CP customers, both with Oracle ERP systems and non-Oracle ERP systems, report benefits from Oracle’s Agile PLM for Process.
    In future articles we will explain in greater detail how both existing Oracle customers (like Gallo, Smuckers, Land-O-Lakes and Starbucks) and new Oracle customers (like ConAgra, Tyson, McDonalds and Heinz) have all realized the benefits of Agile PLM for Process and its integration with their ERP systems.

    More to Come

    Stay tuned for more articles in our blog series "The Business of Winning Innovation." While we will also feature articles focused on other industries, look forward to more on how Agile PLM for Process addresses the innovation challenges facing the CP industry. Additional topics include: Innovation Data Management (IDM), New Product Development (NPD), Product Quality Management (PQM), Menu Management, Private Label Management, and more! Watch this video for more info about Agile PLM for Process.

    Read the article

  • Cron job fails for any time other than default * * * * *

    - by Raghu
    On Ubuntu 11.10 (Oneiric Ocelot), my cron job runs fine if I use the default * * * * *, but if I want it to run at 17:00, or at any other specific time, it never runs. My settings are:

        00 17 * * * wget http://www.abc.com/a.php

    I also tried:

        00 17 * * * root wget http://www.abc.com/a.php

    I also tried specifying the full path. There is a trailing newline, and I'm logged in as root. Here is my complete crontab:

        TZ=Australia/Sydney
        22 7 * * * /usr/bin/wget http://www.abc.com/a.php
        22 7 * * * /bin/date >> /tmp/date.txt

    The output is as follows:

        root@Scrunch:~# sudo crontab -l -u root
        55 12 * * * date >>/tmp/crontest.txt
        root@Scrunch:~#

    Why is the terminal displaying so many blank lines after outputting the crontab entries? Do you suspect stray carriage returns? I have not put entries in any other cron locations like /etc/cron.d or /etc/cron.daily.
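
    For what it's worth, a defensively written crontab makes this much easier to diagnose: an explicit PATH, absolute paths, and captured output. Note that the extra "root" user field belongs only in /etc/crontab, not in a crontab edited with crontab -e. A sketch:

        # Edit with: crontab -e
        PATH=/usr/local/bin:/usr/bin:/bin
        # 17:00 daily; output is captured so failures leave a trace.
        0 17 * * * /usr/bin/wget -O /dev/null http://www.abc.com/a.php >> /tmp/cron-wget.log 2>&1

        # To check whether cron fired at all:
        #   grep CRON /var/log/syslog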

    Read the article

  • Where did ULSTraceLog go to in the SharePoint 2010 Logging Database?

    The Logging Database is one of the many new concepts that will make the life of many SharePoint administrators quite a bit more enjoyable. In SharePoint 2007 the Unified Logging System (ULS) logged all of its data to text files, typically found on your SharePoint server in 12\LOGS. We still have that in SharePoint 2010, but besides those text files, ULS can also write the data to a database! The advantages are obvious: easy to query, one central location for all servers in the farm, easy to build...

    Read the article

  • Summit Time!

    - by Ajarn Mark Caldwell
    Boy, how time flies!  I can hardly believe that the 2011 PASS Summit is just one week away.  Maybe it snuck up on me because it’s a few weeks earlier than last year.  Whatever the cause, I am really looking forward to next week.  The PASS Summit is the largest SQL Server conference in the world, with a fantastic networking opportunity thrown in for no additional charge.  Here are a few thoughts to help you maximize the week.

    Networking

    As Karen Lopez (blog | @DataChick) mentioned in her presentation for the Professional Development Virtual Chapter just a couple of weeks ago, “Don’t wait until you need a new job to start networking.”  You should always be working on your professional network.  Some people, especially technical-minded people, get confused by the term networking.  The first image that used to pop into my head was the image of some guy standing, awkwardly, off to the side of a cocktail party, trying to schmooze those around him.  That’s not what I’m talking about.  If you’re good at that sort of thing, and you can strike up a conversation with some stranger and learn all about them in 5 minutes, and walk away with your next business deal all but approved by the lawyers, then congratulations.  But if you’re not, and most of us are not, I have two suggestions for you.  First, register for Don Gabor’s 2-hour session on Tuesday at the Summit called Networking to Build Business Contacts.  Don is a master at small talk, and at teaching others, and in just those two short hours will help you with important tips about breaking the ice, remembering names, and smooth transitions into and out of conversations.  Then go put that great training to work right away at the Tuesday night Welcome Reception and meet some new people, which is really my second suggestion…just meet a few new people.  You see, “networking” is about meeting new people and being friendly without trying to “work it” to get something out of the relationship at this point.  In fact, Don will tell you that a better way to build the connection with someone is to look for some way that you can help them, not how they can help you.

    There are a ton of opportunities as long as you follow this one key point: Don’t stay in your hotel!  At the least, get out and go to the free events such as the Tuesday night Welcome Reception, the Wednesday night Exhibitor Reception, and the Thursday night Community Appreciation Party.  All three of these are perfect opportunities to meet other professionals with a similar job or interest as you, and you never know how that may help you out in the future.  Maybe you just meet someone to say HI to at breakfast the next day instead of eating alone.  Or maybe you cross paths several times throughout the Summit and compare notes on different sessions you attended.  And you just might make new friends that you look forward to seeing year after year at the Summit.  Who knows, it might even turn out that you have some specific experience that will help out that other person a few months from now when they run into the same challenge that you just overcame, or vice-versa.  But the point is, if you don’t get out and meet people, you’ll never have the chance for anything else to happen in the future.
    One more tip for shy attendees of the Summit…if you can’t bring yourself to strike up conversation with strangers at these events, then at the least, after you sit through a good session that helps you out, go up to the speaker and introduce yourself and thank them for taking the time and effort to put together their presentation.  Ideally, when you do this, tell them WHY it was beneficial to you (e.g. “Now I have a new idea of how to tackle a problem back at the office.”)  I know you think the speakers are all full of confidence and are always receiving a ton of accolades and applause, but you’re wrong.  Most of them will be very happy to hear first-hand that all the work they put into getting ready for their presentation is paying off for somebody.

    Training

    With over 170 technical sessions at the Summit, training is what it’s all about, and the training is fantastic!  Of course there are the big-name trainers like Paul Randal, Kimberly Tripp, Kalen Delaney, Itzik Ben-Gan and several others, but I am always impressed by the quality of the training put on by so many other “regular” members of the SQL Server community.  It is amazing how you don’t have to be a published author or otherwise recognized as an “expert” in an area in order to make a big impact on others just by sharing your personal experience and lessons learned.  I would rather hear the story of, and lessons learned from, “some guy or gal” who has actually been through an issue and came out the other side, than I would a trained professor who is speaking just from theory or an intellectual understanding of a topic.

    In addition to the three full days of regular sessions, there are also two days of pre-conference intensive training available.  There is an extra cost to this, but it is a fantastic opportunity.  Think about it…you’re already coming to this area for training, so why not extend your stay a little bit and get some in-depth training on a particular topic or two?  I did this for the first time last year.  I attended one day of extra training and it was well worth the time and money.  One of the best reasons for it is that I am so busy at home with my regular job and family that it was hard to carve out the time to learn about the topic on my own.  It worked out so well last year that I am doubling up and doing two days of “pre-cons” this year.

    And then there are the DVDs.  I think these are another great option.  I used the online schedule builder to get ready and have an idea of which sessions I want to attend and when they are (much better than trying to figure this out at the last minute every day).  But the problem that I have run into (seems this happens every year) is that nearly every session block has two different sessions that I would like to attend.  And some of them have three!  ACK!  That won’t work!  What is a guy supposed to do?  Well, one option is to purchase the DVDs, which are recordings of the audio and projected images from each session, so you can continue to attend sessions long after the Summit is officially over.  Yes, many (possibly all) of these also get posted online and attendees can access those for no extra charge, but those are not necessarily all available as quickly as the DVD recordings are, and the DVDs are often more convenient than downloading, especially if you want to share the training with someone who was not able to attend in person.  Remember, I don’t make any money or get any other benefit if you buy the DVDs or from anything else that I have recommended here.
    These are just my own thoughts, trying to help out based on my experiences from the 8 or so Summits I have attended.  There is nothing like the Summit.  It is an awesome experience, fantastic training, and a whole lot of fun, which is just compounded if you take advantage of the first part of this article and make some new friends along the way.

    Read the article

  • Fix for EF4 Profiler Issue Coming in next Cumulative Update

    - by Ajarn Mark Caldwell
    Hey!  What do you know?  Microsoft Connect really works! I was very happy this morning to open my email and find a notice from Umachandar on the SQL Programmability Team that they have created a fix for the Odd Profiler Results with EF4 issue that I wrote about last June.  Not only did I blog about it, but I logged an item to Connect with repro steps and sample code.  And now, they have announced that they have a fix for this problem and that it will be included in the next Cumulative Update for SQL Server 2008 R2. For those of you not running 2008 R2, or who prefer to wait for full Service Packs rather than install the latest Cumulative Updates, I also wrote about a workaround for the issue, as long as you do not require the Multiple Active Result Sets feature to be enabled. It is easy with Microsoft to get the feeling that you’re just shouting in the wind, and it is nice to get validation once in a while that they really are listening.

    Read the article

  • Smooth animation in Cocos2d for iOS

    - by MrDatabase
    I move a simple CCSprite around the screen of an iOS device using this code:

        [self schedule:@selector(update:) interval:0.0167];

        - (void) update:(ccTime)delta {
            CGPoint currPos = self.position;
            currPos.x += xVelocity;
            currPos.y += yVelocity;
            self.position = currPos;
        }

    This works; however, the animation is not smooth. How can I improve the smoothness of my animation? My scene is exceedingly simple (just one full-screen CCSprite with a background image and a relatively small CCSprite that moves slowly). I've logged the ccTime delta and it's not consistent (it's almost always greater than my specified interval of 0.0167... sometimes by up to a factor of 4x). I've considered tailoring the motion in the update method to the delta time (larger delta = larger movement, etc.). However, given the simplicity of my scene, it seems there's a better way (and something basic that I'm probably missing).

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #053 – Final Post in Series

    - by Pinal Dave
    It has been a fantastic journey to write the Memory Lane series for an entire year. This series gave me the opportunity to go back and see what I have contributed to this blog throughout the last 7 years. It was indeed a fantastic series, as it gave me the opportunity to witness how technology has grown throughout the years and how I have progressed in my career while writing this blog. It was also a fantastic experience for readers, as many joined during the last few years and were not sure what they had missed in earlier years. Let us continue with the final episode of the Memory Lane series. Here is the list of selected articles of SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my most favorite articles and listed them here with additional notes below each. Let me know which one of the following is your favorite article from memory lane.

    2007

    Get Current User – Get Logged In User
    Here is the straight script which lists logged-in SQL Server users.

    Disable All Triggers on a Database – Disable All Triggers on All Servers
    Question: How do you disable all the triggers for a database? Additionally, how do you disable all the triggers for all servers? For the answer, execute the script in the blog post.

    Importance of Master Database for SQL Server Startup
    I have received the following questions many times; I will list them here and answer them together. What is the purpose of the Master database? Should we back up the Master database? Which database is a must-have database for SQL Server startup? Which are the default system databases created when SQL Server 2005 is installed for the first time? What happens if the Master database is corrupted? The answers to all of these questions are very much related.

    2008

    DECLARE Multiple Variables in One Statement
    SQL Server is a great product and it has many features which are very unique to SQL Server. Regarding the feature where multiple variables can be declared in one statement, it is absolutely possible to do.

    2009

    How to Enable Index – How to Disable Index – Incorrect syntax near ‘ENABLE’
    Many times I have seen that the index is disabled when there is a large update operation on the table. A bulk insert of a very large file into any table using SSIS is usually preceded by disabling the index and followed by enabling the index. I have seen many developers running the following query to disable the index.

    2010

    List of all the Views from Database
    I received many emails saying that people have hundreds of views and no clue what is going on – how many of them have indexes and how many do not. Some even asked me if there is any way to get a list of the views with the Index property along with it. Here is the quick script which does exactly that. You can also include many other columns from the same view.

    Minimum Maximum Memory – Server Memory Options
    I was recently reading about SQL Server Memory Options over here. While reading, one line really caught my attention: the minimum value allowed for the max server memory option. The default setting for min server memory is 0, and the default setting for max server memory is 2147483647. The minimum amount of memory you can specify for max server memory is 16 megabytes (MB).

    2011

    Fundamentals of Columnstore Index
    There are two kinds of storage in a database: row store and column store. Row store does exactly as the name suggests – it stores rows of data on a page – and column store stores all the data in a column on the same page. These columns are much easier to search: instead of a query searching all the data in an entire row whether the data are relevant or not, column store queries need only to search a much smaller number of columns.

    How to Ignore Columnstore Index Usage in Query
    In summary, the question in simple words: “How can we ignore using the column store index in selective queries?” Very interesting question. I can understand there may be cases when the column store index is not ideal and needs to be ignored. You can use the query hint IGNORE_NONCLUSTERED_COLUMNSTORE_INDEX to ignore the column store index; the SQL Server engine will then use whichever other index is best.

    2012

    Storing Variable Values in Temporary Array or Temporary List
    SQL Server does not support arrays or a dynamic-length storage mechanism like a list. Absolutely, there are some clever workarounds and a few extraordinary solutions, but not everybody can come up with such solutions. Additionally, sometimes the requirements are simple enough that extraordinary coding is not required. Here is the simple case.

    Move Database Files MDF and LDF to Another Location
    It is not common to keep the database in the same location where the OS is installed. Usually database files are on a SAN, a separate disk array, or SSDs. This is done for performance reasons and from a manageability perspective. The challenge comes up when a database which was installed at the non-preferred default location needs to move to a different location. Here is a quick tutorial on how you can do it.

    UNION ALL and ORDER BY – How to Order Table Separately While Using UNION ALL
    If your requirement is such that you want the top and bottom queries of the UNION resultset independently sorted but in the same result set, you can add an additional static column and order by that column. Let us re-create the same scenario.

    Copy Data from One Table to Another Table – SQL in Sixty Seconds #031 – Video
    http://www.youtube.com/watch?v=FVWIA-ACMNo

    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Trouble switching to Dvorak on Ubuntu 12.04

    - by Dodgie
    I've decided to switch to Dvorak on my Ubuntu machine, but I'm having some trouble. First, I attempted to do this through the GUI: System Settings > Keyboard Layout > add layout (plus sign) > English (programmer Dvorak). This didn't work at first, so I restarted my machine. It seemed to work at the password prompt (if only because QWERTY did not), but I couldn't get it to accept my password. I used the virtual keyboard option to enter my password with mouse clicks (the virtual keyboard was using programmer Dvorak) and was able to get in that way. Once logged in, however, I was back to QWERTY. Second, I tried to switch at the command prompt:

        $ loadkeys /usr/lib/kbd/keytables/dvorak.map

    The error message I received was "Couldn't get a file descriptor referring to the console". Does anyone know what I'm doing wrong? I've looked for a solution to these problems, but couldn't find anything.
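
    For the running X session the layout can also be set directly with setxkbmap, which sidesteps loadkeys entirely (loadkeys only applies to a real text console and needs root). A sketch; dvp is the programmer Dvorak variant of the us layout:

        # Switch the current X session to programmer Dvorak:
        setxkbmap -layout us -variant dvp
        # Or plain Dvorak:
        setxkbmap -layout us -variant dvorak
        # On a text console (Ctrl+Alt+F1), use the console keymap instead:
        sudo loadkeys dvorak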

    Read the article

  • ELMAH

    - by csharp-source.net
    ELMAH (Error Logging Modules and Handlers) is an application-wide error logging facility that is completely pluggable. It can be dynamically added to a running ASP.NET web application, or even all ASP.NET web applications on a machine, without any need for re-compilation or re-deployment. Once ELMAH has been dropped into a running web application and configured appropriately, you get the following facilities without changing a single line of your code:

        * Logging of nearly all unhandled exceptions.
        * A web page to remotely view the entire log of recorded exceptions.
        * A web page to remotely view the full details of any one logged exception.
        * In many cases, you can review the original yellow screen of death that ASP.NET generated for a given exception, even with customErrors mode turned off.
        * An e-mail notification of each error at the time it occurs.
        * An RSS feed of the last 15 errors from the log.

    Read the article

  • Is the Google Webmaster Tools verification temporary?

    - by Senseful
    When you add a site to Google Webmaster Tools, it asks you to verify it (e.g. via a <meta> tag). I verified a site a while ago, but when I logged in, I noticed that it isn't verified anymore. The history shows that it was verified 58 days ago, but then 30 days ago it tried and failed, saying "reverification failed". I'm not sure if this is a result of some setting I changed which required a reverification, or if Google Webmaster Tools periodically tries to re-verify the site. I was under the impression that the verification only happens once, when you add the site, and that you can then delete the <meta> tag. If this is not how it works and it does re-verify periodically, will it require a different <meta> tag value, or can I keep the original one I used and never have to worry about it again?

    Read the article

  • PHPPgAdmin not working in Ubuntu 14.04

    - by Adam
    After a fresh install of Ubuntu 14.04, I've installed postgresql and phppgadmin from the Ubuntu repos. I am using the Apache2 web server. PHP is working fine in the web server, as is phpMyAdmin, but phpPgAdmin is not working. When I try to access it at localhost/phppgadmin, I get a 404 message. I've tried creating a symlink in /var/www to the phppgadmin content, but that doesn't seem to work. How do I fix this? EDIT: note that I am using a local proxy server (Squid) through which I funnel all my online traffic. While this may be part of the problem, I would be surprised if it was, because I am still on the same machine as phppgadmin, and the requests logged in the Apache access log indicate that incoming requests for the page are coming from the local machine (which is allowed in the policies for phppgadmin, if I understand things correctly).
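
    One plausible cause (an assumption, not confirmed above): Ubuntu 14.04 ships Apache 2.4, which no longer reads /etc/apache2/conf.d, where earlier phppgadmin packages hooked themselves in. Re-registering the config the package ships may be enough; a sketch assuming the snippet lives at /etc/phppgadmin/apache.conf:

        # Register phppgadmin's Apache snippet the 2.4 way.
        sudo cp /etc/phppgadmin/apache.conf /etc/apache2/conf-available/phppgadmin.conf
        sudo a2enconf phppgadmin
        sudo service apache2 reload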

    Read the article

  • The Great Divorce

    - by BlackRabbitCoder
    I have a confession to make: I've been in an abusive relationship for more than 17 years now.  Yes, I am not ashamed to admit it, but I'm finally doing something about it. I met her in college, she was new and sexy and amazingly fast -- and I'd never met anything like her before.  Her style and her power captivated me and I couldn't wait to learn more about her.  I took a chance on her, and though I learned a lot from her -- and will always be grateful for my time with her -- I think it's time to move on. Her name was C++, and she so outshone my previous love, C, that any thoughts of going back evaporated in the heat of this new romance.  She promised me she'd be gentle and not hurt me the way C did.  She promised me she'd clean-up after herself better than C did.  She promised me she'd be less enigmatic and easier to keep happy than C was.  But I was deceived.  Oh sure, as far as truth goes, it wasn't a complete lie.  To some extent she was more fun, more powerful, safer, and easier to maintain.  But it just wasn't good enough -- or at least it's not good enough now. I loved C++, some part of me still does, it's my first-love of programming languages and I recognize its raw power, its blazing speed, and its improvements over its predecessor.  But with today's hardware, at speeds we could only dream to conceive of twenty years ago, that need for speed -- at the cost of all else -- has died, and that has left my feelings for C++ moribund. If I ever need to write an operating system or a device driver, then I might need that speed.  But 99% of the time I don't.  I'm a business-type programmer and chances are 90% of you are too, and even the ones who need speed at all costs may be surprised by how much you sacrifice for that.   That's not to say that I don't want my software to perform, and it's not to say that in the business world we don't care about speed or that our job is somehow less difficult or technical.  There's many times we write programs to handle millions of real-time updates or handle thousands of financial transactions or tracking trading algorithms where every second counts.  But if I choose to write my code in C++ purely for speed chances are I'll never notice the speed increase -- and equally true chances are it will be far more prone to crash and far less easy to maintain.  Nearly without fail, it's the macro-optimizations you need, not the micro-optimizations.  If I choose to write a O(n2) algorithm when I could have used a O(n) algorithm -- that can kill me.  If I choose to go to the database to load a piece of unchanging data every time instead of caching it on first load -- that too can kill me.  And if I cross the network multiple times for pieces of data instead of getting it all at once -- yes that can also kill me.  But choosing an overly powerful and dangerous mid-level language to squeeze out every last drop of performance will realistically not make stock orders process any faster, and more likely than not open up the system to more risk of crashes and resource leaks. And that's when my love for C++ began to die.  When I noticed that I didn't need that speed anymore.  That that speed was really kind of a lie.  Sure, I can be super efficient and pack bits in a byte instead of using separate boolean values.  Sure, I can use an unsigned char instead of an int.  But in the grand scheme of things it doesn't matter as much as you think it does.  The key is maintainability, and that's where C++ failed me.  
    I like to tell the other developers I work with that there are two levels of correctness in coding: Is it immediately correct? Will it stay correct? That is, you can hack together any piece of code and make it correct to satisfy a task at hand, but if a new developer can't come in tomorrow and make a fairly significant change to it without jeopardizing that correctness, it won't stay correct.

    Some people laugh at me when I say I now prefer maintainability over speed.  But that is exactly the point.  If you focus solely on speed you tend to produce code that is much harder to maintain over the long haul, and that's a load of technical debt most shops can't afford to carry, and they end up completely scrapping code before its time.  When good code is written well for maintainability, though, it can be correct both now and in the future.

    And you know the best part is?  My new love is nearly as fast as C++, and in some cases even faster -- and better than that, I know C# will treat me right.  Her creators have poured hundreds of thousands of hours of time into making her the sexy beast she is today.  They made her easy to understand and not an enigmatic mess.  They made her consistent and not moody and amorphous.  And they made her perform as fast as I care to go by optimizing her both at compile time and at run time.  Her code is so elegant and easy on the eyes that I'm not worried where she will run to or what she'll pull behind my back.  She is powerful enough to handle all my tasks, fast enough to execute them with blazing speed, maintainable enough so that I can rely on even fairly new peers to modify my work, and rich enough to allow me to satisfy any need.  C# doesn't ask me to clean up her messes!  She cleans up after herself and she tries to make my life easier for me by taking on most of those optimization tasks C++ asked me to take upon myself.

    Now, there are many of you who would say that I am the cause of my own grief, that it was my fault C++ didn't behave because I didn't pay enough attention to her.  That I alone caused the pain she inflicted on me.  And to some extent, you have a point.  But she was so high maintenance, requiring me to know every twist and turn of her vast and unrestrained power, that any wrong turn or bout of forgetfulness was met with painful reminders that she wasn't going to watch my back when I made a mistake.  But C#, she loves me when I'm good, and she loves me when I'm bad, and together we make beautiful code that is both fast and safe.

    So that's why I'm leaving C++ behind.  She says she's changing for me, but I have no interest in what C++0x may bring.  Oh, I'll still keep in touch, and maybe I'll see her now and again when she brings her problems to my door and asks for some attention -- for I always have a soft spot for her, you see.  But she's out of my house now.  I have three kids and a dog and a cat, and all require me to clean up after them; why should I have to clean up after my programming language as well?

    Read the article

  • How do I change the language via a terminal?

    - by McGee
    Using System Settings I changed my language to Arabic and deleted the English language from the settings. Then the computer lagged and it logged me out – now I can't log back in because the login screen is in Arabic. So is there a way to reset my language to the default via a terminal, change the login screen language, or log in via a terminal, which is still in English? I only have access to the guest account and a terminal. I changed the password to something that could be typed in Arabic (http://www.psychocats.net/ubuntu/resetpassword), then logged in and used System Settings to restore the default.
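
    A text console is one way around a localized greeter, since virtual terminals are not affected by the desktop language. A sketch (press Ctrl+Alt+F1, log in there, then reset the default; lightdm is the usual display manager on Ubuntu of this era):

        # At the VT, after logging in with your username and password:
        sudo update-locale LANG=en_US.UTF-8 LANGUAGE="en_US:en"
        # Restart the login screen so it picks up the change:
        sudo service lightdm restart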

    Read the article

  • How do I start VNC Server on boot?

    - by broiyan
    How do I create a system-wide autostart file? This would be on a cloud server running the desktop version of Maverick. I logged in as root and created an autostart file using System / Preferences / Startup Applications, but it ended up in /root/.config/autostart and did not execute (as far as I can tell) upon rebooting. The autostart file is meant to invoke a bash script that starts the VNC server. I copied the .desktop autostart file from /root/.config/autostart to /etc/xdg/autostart and rebooted; this did not seem to make a difference.
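
    For reference, /etc/xdg/autostart is the system-wide counterpart of ~/.config/autostart, but entries there run when a user logs into a desktop session, not at boot; on a server nothing starts until someone logs in (an init script would be needed for true boot-time startup). A sketch of a system-wide entry; the script path is a placeholder:

        # Save as /etc/xdg/autostart/vncserver.desktop (comment lines are allowed):
        [Desktop Entry]
        Type=Application
        Name=VNC server
        Exec=/usr/local/bin/start-vnc.sh
        NoDisplay=true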

    Read the article
