Search Results

Search found 5414 results on 217 pages for 'regular'.

  • Disable incremental search in Firefox (and everything else!)

    - by Alan Curry
    All I find from googling "disable incremental search" is a bunch of people telling me how great incremental search is. It isn't. Firefox has the worst version of it, jumping around and making me lose my place because of a search I haven't even finished typing yet. I don't want the window scrolling up and down without my say-so. It would be nice if I could search with regular expressions, like text search has been done in every non-toy application since ed. But the jumpiness of the window is the overriding concern. How can this evil be defeated?

    Read the article

  • Controlling access to my API using SSH public key (not SSL)

    - by tharrison
    I have the challenge of implementing an API to be consumed by relatively non-technical clients -- pasting some sample code into their WordPress or homegrown PHP site is probably as much as we can ask. Asking them to install SSL on their servers ain't happening. So I am seeking a simple yet secure way to authenticate API clients. OAuth is the obvious solution, but I don't think it passes the "simple" test. Adding a client id and hashed secret as a parameter to the requests is closer -- it's not hard to do md5($secret . $client_id) or whatever the PHP would be. It seems to me that if client requests could use the same approach as SSH public keys (the client gives us a key from their server(s)), there should be some existing magic to make all of the subsequent transactions transparently work just as regular HTTP API requests. I am still working this out (obviously :-), so if I am being an idiot, it would be nice to know why. Thanks!
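
    A minimal sketch of the signed-request idea the poster describes, with hypothetical parameter names, using HMAC-SHA256 rather than a bare md5($secret . $client_id) concatenation (plain hash concatenation is weaker against extension attacks):

        import hashlib
        import hmac
        import time

        def sign_request(client_id, secret, params):
            # Merge in the client id and a timestamp (to limit replay),
            # canonicalize by sorting keys, then HMAC with the shared secret.
            params = dict(params, client_id=client_id, ts=int(time.time()))
            payload = '&'.join('%s=%s' % (k, params[k]) for k in sorted(params))
            params['sig'] = hmac.new(secret.encode(), payload.encode(),
                                     hashlib.sha256).hexdigest()
            return params  # send these as ordinary HTTP query parameters

        # e.g. sign_request('client42', 's3cret', {'action': 'list'})

    The server recomputes the same HMAC from its copy of the secret and compares; the secret itself never travels over the wire, which is as close to the SSH-key spirit as a plain-HTTP scheme easily gets.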

    Read the article

  • How to stop screens from turning off in Windows 8.1

    - by Warpzit
    So recently my computer started turning my monitors off after a while without activity; if I move the mouse or hit the keyboard, they turn back on. I just want them to stay on. My system is Windows 8.1 64-bit. I have 2 screens on Mini DisplayPort and 1 television on HDMI, which is often in regular television mode when this issue happens. Now before you jump to conclusions, this is what I've tried:
      Disabled the screen saver
      Power saving settings set to High with the following:
        Turn off the display: Never
        Put the computer to sleep: Never
        Turn off hard disk after: Never
        Sleep after: Never
        Display: Turn off display after: Never
        Display: Enable adaptive brightness: Off
        Multimedia settings: When sharing media: Prevent idling to sleep
    Any pointers that get me in the right direction will be greatly appreciated.

    Read the article

  • Windows VPS/Cloud Host Recommendations?

    - by user18937
    As Hosting.com is no longer offering Windows VPS accounts, we are looking for a new US-based provider. We're looking for something that offers standard Windows IIS hosting for a DotNetNuke portal site with SQL Server Express or Web edition. Basic managed services for OS updates as well as regular backups are required. A good level of support and solid uptime are critical. The budget is in the $150-$200/month range, but flexible depending on quality and services offered. Does anyone have any suggestions and good feedback they can share? We are currently looking at JodoHost as an option, but would like some other possibilities, as we have found their support can be suspect at times. A cloud solution in the same range would also be an option.

    Read the article

  • How to report spam to blacklists

    - by hayalci
    Is there a central place to report spam to various blacklists? I regularly report to SpamCop, but I do not see the addresses I reported as listed. (I guess nobody else bothers with my regular spammer. After being frustrated with Bayes and SpamCop, I blocked its /24 subnet.) SpamCop is only a single service. I want to make the spammer known to a large number of services, and hopefully blocked by many of them. I looked for some other blacklists to report to, but the ones I looked at do not consider user submissions (or they hide it well).

    Read the article

  • Variable for the suffix of $request_uri that didn't match the location block prefix

    - by hsivonen
    Suppose I want to move an /images/ directory to an images host so that what was before http://example.org/images/foo.png becomes http://images.example.org/foo.png. If I do: location /images/ { return 301 http://images.example.org$request_uri; }, the result is a redirect to http://images.example.org/images/foo.png, which isn't what I want. An older question has an answer that suggests using a regexp location, but that seems like overkill. Is there really no way to refer to $request_uri with the location prefix chopped off, without using regular expressions? Seems like an obvious feature to have.
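
    For reference, the regex-location workaround the poster hopes to avoid is short; a capture group stands in for the missing "suffix" variable (a sketch):

        location ~ ^/images/(.*)$ {
            return 301 http://images.example.org/$1;
        }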

    Read the article

  • Everything You Ever Wanted to Know about Mod_Rewrite Rules but Were Afraid to Ask?

    - by Kyle Brandt
    How can I become an expert at writing mod_rewrite rules? What is the fundamental format and structure of mod_rewrite rules? What form/flavor of regular expressions do I need to have a solid grasp of? What are the most common mistakes/pitfalls when writing rewrite rules? What is a good method for testing and verifying mod_rewrite rules? Are there SEO or performance implications of mod_rewrite rules I should be aware of? Are there common situations where mod_rewrite might seem like the right tool for the job but isn't? What are some common examples?
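
    As a concrete anchor for the questions above: mod_rewrite patterns are PCRE regular expressions, and a minimal annotated rule (illustrative only) looks like this:

        RewriteEngine On
        # Send /old/anything to /new/anything. [R=301] makes the redirect
        # permanent (which is what search engines want to see); [L] stops
        # processing further rules for this request.
        RewriteRule ^old/(.*)$ /new/$1 [R=301,L]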

    Read the article

  • Touchscreen on KDE and Ubuntu?

    - by The Quantum Physicist
    I just bought a Lenovo Yoga 2 Pro... I liked the behavior of the touchscreen on Windows, and it makes sense as it does on my smartphone. However, I'm not a regular Windows user, so I installed Kubuntu 14.04, and everything looks fine, except that the behavior of the touchscreen is so limited that it's useless. Why? Because all the touchscreen does is act as a single mouse with left click. For example, if I touch the screen for a relatively long time, I don't get the effect of a right click. How do I configure the touchscreen properly to get the expected behavior on Ubuntu and KDE? Thanks for any efforts.

    Read the article

  • Can an SSD notify the hosting OS that its wear level is getting high?

    - by Tony_Henrich
    I read a lot about SSDs and I am interested in them for server use. My biggest concern is their reliability: a lot of writes shortens their life span. I could mitigate this problem if I could run some kind of diagnostics on a regular basis on the SSD, or if the SSD could automatically warn the OS that its reliability is reaching a critical level. Think of this as S.M.A.R.T. or software like SpinRite for SSDs. Does anything I mentioned exist now? Which kind/brand of SSD does this? I don't mind swapping out a tired SSD for a newer one once in a while. I am pretty sure that an SSD's life is measured in years and not in a few months? For me, the improved performance will pay for the SSD over and over. I am planning to use plenty of RAM as well.
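
    For what it's worth, many SSDs already expose wear counters through S.M.A.R.T.; a sketch using smartmontools (the attribute name is vendor-specific, e.g. Media_Wearout_Indicator on Intel drives or Wear_Leveling_Count on Samsung):

        # Dump all SMART attributes and look for the drive's
        # wear/endurance indicator.
        smartctl -a /dev/sda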

    Read the article

  • Set up homeserver with single IP to host multiple sites on Ubuntu [closed]

    - by Ortix92
    I am trying to set up my home server so it can function like a regular server one would rent. I am running Ubuntu 12.04 LTS with OpenPanel. I have a single static IP address. I am used to having two addresses, pointing them to NS1.domain.tld and NS2.domain.tld, and setting up the proper DNS records. I would also like to mention that I am somewhat new to DNS zones. Either way, how would I go about setting this up correctly (in OpenPanel) with just a single IP address, if it is possible at all? I have also read about free solutions online, but I would like to keep everything secure and private so other people can't peer into my data somehow. Thanks!

    Read the article

  • SSH logins failing before success

    - by Vincent
    I am running Ubuntu 12.04 Server, fully updated, to run a web server on Tomcat 7. I have about 1000 clients that very often use rsync to sync some files with this server. Those rsync jobs use SSH with a certain user to open connections on the server. The result is that my server is, as normal, full of connections by the same user -- about 5 connections per second, every day, at any time. Then, when I try to open a regular SSH connection with my PuTTY client, the connection fails before login, saying "Server unexpectedly closed network connection", in about 6 out of 10 attempts; for the other 4 attempts out of 10, it works normally and I am able to log in as any user. Is there an overload of connections here? The server statistics are very calm, showing less than 40% network usage and less than 2% CPU. How can I improve this? Thank you for any help. V.
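
    One plausible culprit to check, offered as an assumption rather than a diagnosis: sshd's MaxStartups setting randomly drops connections that have not yet authenticated once too many handshakes are pending, and a steady stream of rsync logins could keep that queue full. Raising the limits in /etc/ssh/sshd_config might look like:

        # start:rate:full -- begin dropping 30% of unauthenticated
        # connections at 50 pending, and drop all of them at 200.
        MaxStartups 50:30:200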

    Read the article

  • How many XMLHttpRequests are too many for a PC to handle?

    - by Uri
    I'm running MediaWiki under Apache on a regular PC running Vista (I don't know the exact specs, but I'm guessing at least a Core 2 Duo 2 GHz processor and a broadband connection of at least 500 kb/s, probably 1 Mb/s). I want to use the MediaWiki API to send a lot of requests to this server. Most of the time the requests will be sent over the LAN (but sometimes over the internet). I'm talking thousands of requests every few seconds in the worst case. (A lot of these requests may repeat themselves; I guess some sort of cache would help.) Will the server handle this, or do I need a stronger/dedicated computer? (I'm not looking for a specific yes/no; I just want to get an idea as to what configuration of computer will support how many requests per second.) Thanks

    Read the article

  • DNS lookup fails when forwarding to subdomain

    - by Kitaro
    In order to migrate to a new mail server with few DNS problems and little downtime, I have set up a second Postfix that is currently accessible via a subdomain MX record; e.g. the main Postfix accepts mail for [email protected] while the second Postfix also accepts mail for [email protected]. I added a forwarding rule saying that Postfix should forward mail destined for [email protected] to [email protected] (for regular local delivery) and to [email protected]. Local delivery still works as expected, but when trying to forward the mail to the new MX, Postfix appends the domain part at the end of the forwarding address, resulting in [email protected], which of course fails, and the mail bounces. Why does Postfix mess with the alias name in that way, and how can I turn that off?

    Read the article

  • Why are browsers so heavy?

    - by Kaivosukeltaja
    Back in 1998 I had a computer with a 233 MHz Pentium MMX CPU and a graphics card with no 3D acceleration. It was able to run games like Quake II at a decent FPS rate. My current computer has tons more performance and a mid-class GPU, yet struggles to reach 20 FPS when rendering a single model inside a skybox with WebGL. Even regular pages with lots of 2D CSS animations bring many modern computers to their metaphorical knees. As a web developer I understand there's a lot going on in a web page, but not what makes it that heavy. Modern browsers compile JavaScript to native machine code before running it, and rendering into a canvas element shouldn't trigger DOM rebuilds, so theoretically it should be a lot faster than it is. What am I missing here, and is it possible to avoid or minimize whatever is making browsers slow, in order to build more efficient websites?

    Read the article

  • RAID 1 in Ubuntu 12.04

    - by Bavly Hanna
    Right now I have a small file server on which I have loaded Ubuntu 12.04 Desktop, on a small 160 GB hard drive. This hard drive is the primary drive from which the OS boots. I want to move all my data to my file server so it can be shared on the network (it is contained on 2x2 TB hard drives in my desktop). The 2 TB drives are in hardware RAID 1. I simply want to move them to the file server and set them up in software RAID 1. If at all possible, I'd like to be able to do this without losing any data on the drives. I've searched around, and the guides I find describe RAIDing drives for the boot drive, but these wouldn't be boot drives, just regular storage drives. If someone could tell me how to do this or point me in the right direction, it would be much appreciated.
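
    A common approach, sketched here with hypothetical device names and not guaranteed lossless (back up first): build a degraded RAID 1 from one disk, copy the data onto it, then attach the second disk and let it resync.

        # Create a RAID 1 array with one member deliberately missing.
        mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 missing
        mkfs.ext4 /dev/md0               # destroys whatever was on sdb1
        # ...mount /dev/md0 and copy the data over from the other drive...
        mdadm --add /dev/md0 /dev/sdc1   # second drive joins and resyncs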

    Read the article

  • Sync two external hard drives?

    - by acidzombie24
    A little mishap happened earlier today, and I am thinking I should have a copy of my external hard drive, since 10% of it is very valuable. What is the best solution to keep two external hard drives in sync? I'll probably use one as the regular drive and maybe use the other only to copy data to. The easiest way to keep them in sync is to clear one drive and copy the other over, but 1 TB of data will take a long time. What's a good existing app that will keep them in sync? Freeware preferred.
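
    One commonly suggested tool is rsync (standard on Linux and OS X, available on Windows through Cygwin); a sketch, assuming hypothetical mount points:

        # Mirror drive A onto drive B: -a preserves permissions and
        # timestamps, --delete removes files that vanished from the source,
        # and only changed files are copied, so repeat runs are fast.
        rsync -a --delete /mnt/driveA/ /mnt/driveB/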

    Read the article

  • Prevent access to the C drive

    - by Jenko
    Is it possible to prevent regular users from accessing the C drive via Windows Explorer? They should still be allowed to execute certain programs. This is to ensure that employees cannot steal or copy out proprietary software even though they should be able to execute it. One way would be to change the option in Windows Group Policy and set the "shell" to something other than "explorer.exe". I'm looking for a similar Windows setting that just hides the C drive or otherwise prevents trivial access. This is for Windows XP/7.
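
    One registry-based sketch, offered as an assumption rather than a complete answer (it only hampers Explorer, not the command line, and would normally be deployed via Group Policy): the NoViewOnDrive policy value takes a drive bitmask where A=1, B=2, C=4.

        Windows Registry Editor Version 5.00

        ; Block Explorer access to drive C: (bitmask 4) for the current user.
        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
        "NoViewOnDrive"=dword:00000004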

    Read the article

  • Is there a way to redirect certain URLs to specific web browsers in Linux?

    - by jraxxo
    I'm using Chrome as my default browser in Ubuntu 12.10. I need to use Firefox for business purposes (certain websites pertaining to my work only work with Firefox). Is there a way to force Ubuntu to use Firefox for certain types of URLs (maybe as defined by a regular expression) while maintaining Chrome as my default browser for all my other tasks? Perhaps a shell script running in the background? I'd like this to work system-wide, covering links from Chrome itself as well as from PDFs/ODTs, etc. I have searched for solutions, but I couldn't find anything besides OpenWith, a Firefox extension which adds a button to open certain links in other browsers -- which would again require me to open Firefox beforehand, and so does not help me at all. Does anyone have any ideas? Something like Choosy for Linux?
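
    Absent a ready-made tool, a small dispatcher script registered as the system's default browser can do the routing; a sketch with a hypothetical pattern list:

        #!/usr/bin/env python
        # Hypothetical URL dispatcher: register this script as the default
        # browser (e.g. via xdg-settings), and it routes matching URLs to
        # Firefox while everything else goes to Chrome.
        import re
        import subprocess
        import sys

        FIREFOX_PATTERNS = [r'^https?://(\w+\.)?work\.example\.com/']  # placeholder

        def main():
            url = sys.argv[1]
            if any(re.search(p, url) for p in FIREFOX_PATTERNS):
                subprocess.call(['firefox', url])
            else:
                subprocess.call(['google-chrome', url])

        if __name__ == '__main__':
            main()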

    Read the article

  • Running a cronjob

    - by Ed01
    I've been puzzling over cron jobs for the last few hours. I've read documentation and examples. I understand the basics and concepts, but haven't gotten anything to work. So I would appreciate some help with this total noob dilemma. The ultimate goal is to schedule the execution of a Django function every day. Before I get that far, I want to know that I can schedule any old script to run, first once, then on a regular basis. So I want to: 1) write a simple script (perhaps a bash script) that will allow me to determine that yes, it did indeed run successfully, or that it failed; 2) schedule this script to run at the top of the hour. I tried writing a bash script that simply outputs some text:

        #!/bin/bash
        echo "The script ran"

    Then I dropped this into a .txt file:

        MAILTO = *****.******@gmail.com
        05 * * * * /home/vadmin/development/test.sh

    But nothing happened. I'm sure I did many things wrong. Where do I start to fix all of this?
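
    Two likely gaps in the attempt above, offered as assumptions rather than a diagnosis: a crontab saved as a plain .txt file is not active until it is installed with the crontab command, and the script must be executable. Note also that 05 in the minute field fires at five past each hour (0 would be the top of the hour), and that a cron job's echo output is mailed to MAILTO rather than shown in any terminal. A minimal sketch, using a hypothetical file name:

        chmod +x /home/vadmin/development/test.sh   # cron only runs executable scripts
        crontab mycron.txt                          # install the file as this user's crontab
        crontab -l                                  # verify the installed entries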

    Read the article

  • Mac OS X default editor for .dotsystemfiles

    - by jasonkuhrt
    This is a seemingly simple question, but I can't find an answer so far despite searching quite a bit. I'd like that when I open a .dotsystemfile in Finder (e.g. .htaccess or .vimrc), it opens in an editor other than TextEdit. Doing the regular change-all in the Info panel won't do the trick, as it gives the following error: "An error occurred while changing the application that opens “.vimrc” because not enough information is available. Do you want to open “.vimrc” with “MacVim.app”?" This isn't a huge issue, but it is like a small splinter that I'd love removed. Thanks for any helpful information.

    Read the article

  • Can I use AppleScript to click buttons in the background?

    - by Giorgio
    Sorry for this generic and probably badly written question. I've never programmed in AppleScript, but I'm quite familiar with other languages. I need to click 2 sequential buttons inside the lobby of a program (when you click the first, a popup appears and we should click 'OK'). However, things are a little bit more complicated than that, because: 1) the lobby of this program isn't in the foreground: it's covered by other open windows (I don't have experience, so I don't know if this represents a problem); 2) there should be a timer, and the script should click these buttons at regular intervals. Is this feasible with AppleScript?
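
    Within limits, yes: GUI scripting through System Events can press buttons by their accessibility names, and the whole thing can be repeated on a timer. A sketch with placeholder process and button names (Accessibility access must be granted, and clicking a non-frontmost window depends on the target app exposing its buttons to the accessibility API):

        tell application "System Events"
            tell process "TargetApp" -- hypothetical program name
                click button "Start" of window 1
                delay 1 -- wait for the popup to appear
                click button "OK" of window 1
            end tell
        end tell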

    Read the article

  • Command-line tool to search for file names on offline backup drives

    - by halloleo
    I am looking for an open-source command-line tool to index and search all my (backup) drives at the file-name level. I want to search for file and folder names, preferably written as regular expressions or file glob patterns. The external drives contain just normal HFS and NTFS filesystems. The backups are done via direct file copy. The requirement is that the tool compiles on OS X and works without each of the drives attached, instead pointing me to the right drive in case a drive contains a file matching the pattern I searched for. At the moment I use a hand-knit script solution with locate databases, one for each external backup drive, but this is rather cumbersome, because locate itself can access only one database at a time and does not offer any management of the indices/databases. Are there any other tools out there for this?
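
    Absent a dedicated tool, the per-drive locate databases can be replaced with plain text file lists, which grep can then search across in a single pass (the paths below are assumptions):

        # While a backup drive is attached, index it into a named list.
        find /Volumes/BackupA -print > ~/drive-indexes/BackupA.txt
        # Later, search every drive's index at once; the name of the
        # matching list file tells you which drive to plug in.
        grep -iE 'pattern' ~/drive-indexes/*.txt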

    Read the article

  • HP DL380 Losing Drive Array

    - by jidl
    I have an HP ProLiant DL380 G7 dropping one of its arrays every hour, on the hour, for 2-5 minutes. The OS is SBS 2011 Standard; the server runs Exchange, a DC, file shares & Trend WFBS 8. I can watch the D drive disappear for the duration of the problem -- then it just comes back up and all is well again. There is no loss of network connectivity, although the mapped drives also disappear. We thought it might be to do with SharePoint / VSS writers failing, but it looks as though this is a symptom rather than the cause. It survives a reboot. Any ideas as to what could be running on a regular schedule like this?

    Read the article

  • SQL SERVER – Shrinking NDF and MDF Files – Readers’ Opinion

    - by pinaldave
    Previously, I had written a blog post about SQL SERVER – Shrinking NDF and MDF Files – A Safe Operation. After that, I wrote the following blog post, which talks about the advantages and disadvantages of shrinking and why one should not shrink a file: SQL SERVER – SHRINKFILE and TRUNCATE Log File in SQL Server 2008. On this subject, SQL Server expert Imran Mohammed left an excellent comment. I feel that his comment is worth a big article in itself. For everybody to read his wonderful explanation, I am posting it here. Thanks Imran!

    Shrinking a database always causes performance degradation and increases fragmentation in the database. I suggest that you keep that in mind before you start reading the following comment. If anyone is going to say shrinking a database is bad and evil, here I am saying it first and loud. Now, Imran's comment is written keeping in mind only the process of how the Shrink Database operation works. Imran has already explained his understanding and requests further explanation. I have removed the Best Practices section from Imran's comments, as there are a few corrections.

    Comments from Imran –

    Before I explain the concept of Shrink Database, let us understand the concept of database files. When we create a new database inside SQL Server, SQL Server typically creates two physical files in the operating system: one with the .MDF extension, and another with the .LDF extension. .MDF is called the Primary Data File; .LDF is called the Transactional Log File. If you add one or more data files to a database, the physical file created in the operating system will have the extension .NDF, which is called a Secondary Data File; whereas when you add one or more log files to a database, the physical file created will have the same .LDF extension.

    The questions now are: "Why does a new data file have a different extension (.NDF)?", "Why is it called a secondary data file?" and "Why is the .MDF file called a primary data file?"

    Answers: Note: the following explanation is based on my limited knowledge of SQL Server, so experts, please do comment.

    A data file with the .MDF extension is called a Primary Data File, and the reason is that it contains the Database Catalogs. Catalogs mean Meta Data, and Meta Data is "data about data". Examples of Meta Data include the system objects that store information about other objects, apart from the data stored by the users: sysobjects stores information about all objects in that database; sysindexes stores information about all indexes and rows of every table in that database; syscolumns stores information about all columns that each table has in that database; sysusers stores information about that database's users. Although Meta Data stores information about other objects, it is not the transactional data that a user enters; rather, it is system data about the data. Because the Primary Data File (.MDF) contains important information about the database, it is treated as a special file, and it is given the name Primary Data File because it contains the Database Catalogs. This file is present in the Primary File Group. You can always create additional objects (tables, indexes etc.) in the Primary Data File, by specifying that you want to create the object under the Primary File Group.
    Any additional data file that you add to the database will have only transactional data but no Meta Data; that's why it is called a Secondary Data File. It is given the extension .NDF so that the user can easily identify whether a specific data file is a Primary Data File or a Secondary Data File. There are many advantages to storing data in different files under different file groups. You can put your read-only tables in one file (file group) and your read-write tables in another file (file group), and take a backup of only the file group that has the read-write data, so that you can avoid taking backups of read-only data that cannot be altered. Creating additional files on different physical hard disks also improves I/O performance.

    A real-world scenario where we use files could be this one: let's say you have created a database called MYDB on the D drive, which has 50 GB of space. You have 1 data file (.MDF) and 1 log file on the D drive, and suppose that all of that 50 GB has been used up and you do not have any free space left, but you still want to add more space to the database. One easy option would be to add one more physical hard disk to the server, add a new data file to the MYDB database, create this new data file on the new hard disk, then move some of the objects from one file to another, and make the file group under which you added the new file the default file group, so that any new object that is created goes into the new files unless specified otherwise.

    Now that we have a basic idea of what data files are, what type of data they store and why they are named the way they are, let's move on to the next topic, Shrinking.

    First of all, I disagree with the Microsoft terminology for naming this feature "Shrinking". Shrinking, in regular terms, means reducing the size of a file by means of compressing it. BUT in SQL Server, Shrinking DOES NOT mean compressing. Shrinking in SQL Server means removing empty space from the database files and releasing that empty space either to the Operating System or to SQL Server.

    Let's examine this through an example. Say you have a database "MYDB" with a size of 50 GB that has about 20 GB of free space, which means 30 GB of the database is filled with data and 20 GB of space is free because it is not currently utilized by SQL Server (the database); it is reserved and not yet in use. If you choose to shrink the database and release the empty space to the Operating System -- and MIND YOU -- you can only shrink the database to 30 GB (in our example). You cannot shrink the database to a size smaller than what is filled with data.

    So, if you have a database that is full and has no empty space in the data file and log file (and you don't have extra disk space to set the Auto growth option ON), YOU CANNOT issue the SHRINK Database/File command, for two reasons: first, there is no empty space to be released, because the Shrink command does not compress the database; it only removes empty space from the database files, and there is no empty space. Second, remember that the Shrink command is a logged operation: when we perform the Shrink operation, this information is logged in the log file, so if there is no empty space in the log file, SQL Server cannot write to the log file and you cannot shrink the database.

    Now answering your questions:

    (1) Q: What are the USEDPAGES & ESTIMATEDPAGES that appear in the Results Pane after using DBCC SHRINKDATABASE (NorthWind, 10)?
    A: According to Books Online (for SQL Server 2000): UsedPages: the number of 8-KB pages currently used by the file. EstimatedPages: the number of 8-KB pages that SQL Server estimates the file could be shrunk down to. Important note: before asking any question, make sure you go through Books Online or search on Google first. The reasons for doing so have many advantages: 1. If someone else has already had this question before, the chances that it is already answered are more than 50%. 2. This reduces your waiting time for the answer.

    (2) Q: What is the difference between shrinking the database using a DBCC command like the one above and shrinking it from the Enterprise Manager console by right-clicking the database, going to TASKS and then selecting the SHRINK option, in a SQL Server 2000 environment?

    A: As far as my knowledge goes, there is no difference; both work the same way. One advantage of using the command from Query Analyzer is that your console won't be frozen; you can perform your regular activities using Enterprise Manager.

    (3) Q: What is this .NDF file that is discussed above? I have never heard of it. What is it used for? Is it used by end users, DBAs or the server/system itself?

    A: An .NDF file is a secondary data file. You never heard of it because when a database is created, SQL Server creates the database by default with only 1 data file (.MDF) and 1 log file (.LDF) -- or however your model database has been set up, because the model database is a template used every time you create a new database using the CREATE DATABASE command. Unless you have added an extra data file, you will not see one. This file is used by SQL Server to store the data saved by users.

    Hope this information helps. I would like to ask the experts to please comment if what I understand is not what the Microsoft guys meant.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Readers Contribution, Readers Question, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
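
    To make the commands discussed above concrete, a hedged T-SQL sketch (the file and table names are hypothetical, and per the post's own warning, shrinking fragments indexes):

        -- Shrink the data file to roughly 30 GB (target size is in MB).
        DBCC SHRINKFILE (N'MYDB_Data', 30720);
        -- Shrinking increases fragmentation, so rebuild indexes afterwards.
        ALTER INDEX ALL ON dbo.SomeTable REBUILD;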

    Read the article
