Search Results

Search found 5922 results on 237 pages for 'drop shadows'.


  • Node.js + Express.js: How to render LESS CSS?

    - by Paden
    Hello all, I am unable to render LESS CSS in my Express workspace. Here is my current configuration (my CSS/LESS files go in 'public/stylo/'):

      app.configure(function() {
        app.set('views',       __dirname + '/views');
        app.set('partials',    __dirname + '/views/partials');
        app.set('view engine', 'jade');
        app.use(express.bodyDecoder());
        app.use(express.methodOverride());
        app.use(express.compiler({ src: __dirname + '/public/stylo', enable: ['less'] }));
        app.use(app.router);
        app.use(express.staticProvider(__dirname + '/public'));
      });

    Here is my main.jade file:

      !!!
      html(lang="en")
        head
          title Yea a title
          link(rel="stylesheet", type="text/css", href="/stylo/main.less")
          link(rel="stylesheet", href="http://fonts.googleapis.com/cssfamily=Droid+Sans|Droid+Sans+Mono|Ubuntu|Droid+Serif")
          script(src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js")
          script(src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.7/jquery-ui.min.js")
        body!= body

    Here is my main.less:

      @import "goodies.css";

      body {
        .googleFont;
        background-color: #000000;
        padding: 20px;
        margin: 0px;

        > .header {
          border-bottom:    1px solid #BBB;
          background-color: #f0f0f0;
          margin:           -25px -25px 30px -25px; /* important */
          color:            #333;
          padding:          15px;
          font-size:        18pt;
        }
      }

    And here is my goodies.less:

      .rounded_corners(@radius: 10px) {
        -moz-border-radius:    @radius;
        -webkit-border-radius: @radius;
        border-radius:         @radius;
      }

      .shadows(@rad1: 0px, @rad2: 1px, @rad3: 3px, @color: #999) {
        -webkit-box-shadow: @rad1 @rad2 @rad3 @color;
        -moz-box-shadow:    @rad1 @rad2 @rad3 @color;
        box-shadow:         @rad1 @rad2 @rad3 @color;
      }

      .gradient(@type: linear, @pos1: left top, @pos2: left bottom, @color1: #f5f5f5, @color2: #ececec) {
        background-image: -webkit-gradient(@type, @pos1, @pos2, from(@color1), to(@color2));
        background-image: -moz-linear-gradient(@color1, @color2);
      }

      .googleFont {
        font-family: 'Droid Serif';
      }

    Cool deal. Now: I have installed LESS via npm, and I had heard from another post that @imports should reference the .css, not the .less. In any case, I have tried the combinations of switching .less for .css in the Jade and LESS files with no success. If you can help or have the solution I'd greatly appreciate it. Note: the Jade portion works fine if I enter any ol' .css. Note 2: the LESS compiles if I use lessc via the command line.
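
    A hedged sketch of one thing worth trying (it assumes the Connect compiler middleware behaves the usual way: a request for /stylo/main.css triggers compilation of public/stylo/main.less into main.css, and LESS only inlines mixins from imports that end in .less):

      // main.jade -- ask for the compiled .css; the compiler middleware builds it from stylo/main.less
      link(rel="stylesheet", type="text/css", href="/stylo/main.css")

      // main.less -- import the .less source so its mixins (.googleFont etc.) are actually inlined
      @import "goodies.less";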


  • Oracle performance problem

    - by jreid42
    We are using an Oracle 11G machine that is very powerful and has redundant storage, etc. It's a beast, from what I have been told. We got this DB for a tool that had about 20 users when I first came on as a co-op; now it's upwards of 150 people, and I am the only one working on it :(

    We currently have a system in place that distributes Perl scripts across our entire data center, essentially giving us a sort of "grid" computing power. The Perl scripts run a sort of simulation and report the results back to the database. They do selects / inserts. The load is not very high for each script, but it could be happening across 20-50 systems at the same time. We then have multiple data centers and users all hitting the same database with this same approach.

    Our main problem with this is that our database is getting overloaded with connections and having to drop some. We sometimes have upwards of 500 connections. These are old Perl scripts and they do not handle this well; essentially they fail and the results are lost. I would rather avoid having to rewrite a lot of these, as they are poorly written and a headache to even look at. The database itself is not overloaded; just the connection overhead is too high. We open a connection, make a quick query and then drop the connection. Very short connections, but many of them. The database team has basically said we need to lower the number of connections or they are going to ignore us.

    Because this is distributed across our farm we can't implement persistent connections. I do this with our web server, but it's on a fixed system. The others are Perl scripts that get opened and closed by the distribution tool and thus aren't always running. What would be my best approach to resolving this issue? The scripts themselves can wait for a connection to be open; they do not need to act immediately. Some sort of queuing system?

    It has been suggested that I set up a few instances of a tool called "SQL Relay", maybe one in each data center. How reliable is this tool? How good is this approach? Would it work for what we need? We could have one for each data center and relay requests through it to our main database, keeping a pipeline of open persistent connections? Does this make sense? Are there any other suggestions you can make? Any ideas? Any help would be greatly appreciated.

    Sadly, I am just a co-op student working for a very big company, and somehow all of this has landed on my shoulders (there is literally nobody to ask for help; it's a hardware company, everybody is a hardware engineer, and the database team is useless and in India), and I am quite lost as to what the best approach would be. I am extremely overworked, and this problem is interfering with ongoing progress and basically needs to be resolved as quickly as possible, preferably without rewriting the whole system, purchasing hardware (not gonna happen), or shooting myself in the foot. HELP LOL!


  • Slow git clone and fetch

    - by EtienneT
    I set up gitosis on a Linux server following this tutorial: http://scie.nti.st/2007/11/14/hosting-git-repositories-the-easy-and-secure-way We are using git on our Windows machines with TortoiseGit and msysgit. Pushing changes to the server is pretty fast, but when we want to clone or fetch changes from the remote server, it starts out really fast (800k/s) and then drops pretty quickly to around 3 to 30k/s, and it can take forever to update. git pull for a small update is fast, but as soon as we have to download something of more than a few MB, it is slow. We are switching from SVN to git and this is holding us back from using git full time. Thanks!
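
    One server-side thing worth ruling out (a sketch; the repository path is an assumption based on typical gitosis installs): if the repository has accumulated a lot of loose objects, the server spends its time packing on every clone/fetch, which can look exactly like a transfer that starts fast and then crawls. Repacking once on the server may help:

      # on the gitosis server, as the user that owns the repositories
      cd /home/git/repositories/myproject.git   # adjust -- the actual path depends on how gitosis was installed
      git gc --aggressive --prune=now           # repack everything into a single, well-compressed pack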


  • Make PATH variable changes permanent on openSuse

    - by Marlon
    Okay, so I'm trying to do something that should be rather simple, but for some reason I can't quite seem to make it work. All I want to do is add a path to the PATH environment variable in openSuse. So far, I've replaced the following line in /etc/default/su: PATH=/usr/local/bin:/bin:/usr/bin with this line: PATH=/usr/local/bin:/bin:/usr/bin:/usr/local/php/bin:/usr/local/mysql/bin Basically, all I want is to have access to php and mysqld from the command prompt, regardless of how I log in, without having to type the full /usr/local/php/bin/ path every time. Am I even editing the right file? I'm a bit of a Linux newbie, and achieving something as trivial as this is eluding me. Server gods out there, drop me a crumb, please? :-)
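
    A minimal sketch of the usual approach (it assumes a Bourne-style login shell; on openSuse, /etc/profile.local is the file reserved for local system-wide additions, and ~/.profile works for a single user):

      # /etc/profile.local (all users) or ~/.profile (just your user)
      export PATH="$PATH:/usr/local/php/bin:/usr/local/mysql/bin"

    Log out and back in (or source the file) for the change to show up; /etc/default/su only affects sessions started through su.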


  • Filezilla FTP Server Ports - Active Connections

    - by Brian Webster
    I have been getting errors like the ones below because I did not specify enough ports for the active FTP connections:

      Response: 150 Opening data channel for directory list.
      Response: 425 Can't open data connection.
      Error:    Failed to retrieve directory listing

    Things seem to work nicely with limited ports, but when I perform actions that cause very rapid, short-lived connections, something like 20-30% of the connections drop with the error above. I started with ports 50000-50100. When I opened up the range to 50000-52000, the errors disappeared. Why did adding ports fix my problem? I would like to understand why. I have a suspicion that ports become "locked down" for a few moments around the time they are used in a connection. If connections are happening rapidly enough, there may be no ports available, hence the above error. Can anybody confirm?
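
    If that suspicion is right, the ports are most likely sitting in TCP TIME_WAIT for a couple of minutes after each data connection closes, so a small passive range can run dry under rapid, short-lived transfers. A quick way to check while the errors are happening (a sketch, assuming a standard Windows command prompt on the server):

      netstat -an | find "TIME_WAIT"

    If that list is roughly the size of the original 100-port range, the extra ports are simply giving closed connections time to age out.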


  • Home network with Windows 7 as router

    - by Michael
    Background: I have tried to use routers, but so far none of them can handle the bandwidth and number of connections; they are eventually limited by their hardware resources, so overall the home routers decrease the internet speed. I went through DD-WRT and stuff like that. Question: What I want is to use my Windows 7 PC as the router. It has 2 LAN cards. I'm going to connect another desktop, 2 PCs and a notebook to this router through a wireless router. The main question is: what is the most efficient way to turn this Windows 7 box (and I need Windows for native NTFS support) into a router with NAT/routing/firewall functionality? Is there any routing software recommended for this purpose, or should I just use the native Windows "Internet Sharing"? I'm going to run SIP phones in the LAN, so I need a friendly NAT (full cone, perhaps). Also, I'm going to have an FTP server on that Windows 7 "server" PC. As the firewall I'm thinking about Comodo. I need to drop all incoming traffic unless explicitly allowed.


  • Correct way to treat iptables init failure?

    - by chris_l
    Hi, I'm initializing my iptables rules via /etc/network/if-pre-up.d/iptables, using iptables-restore. This works fine, but I'm a bit worried about what would happen if that script failed for some reason (maybe the saved iptables file is corrupt or whatever). In case the script failed, I'd like to:

      - start up my network interfaces without any iptables rules
      - start up the OpenSSH server
      - but not any other services like the web server, ... (and maybe stop running instances)

    Is there a good canonical way to do that? Going into a lower init stage? I haven't done that in a long time, and I think a lot about init has changed in recent years (?). Which stage should I drop to, and would the OpenSSH server and my network interfaces still run? Thanks, Chris (on Debian Lenny)
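
    For the firewall half of the problem, one option is to handle the failure inside the if-pre-up.d script itself: if iptables-restore fails, load a minimal SSH-only rule set instead of coming up wide open. A sketch (the rules file path and the choice of fallback rules are assumptions):

      #!/bin/sh
      # /etc/network/if-pre-up.d/iptables -- restore saved rules, fall back to SSH-only on failure
      if ! iptables-restore < /etc/iptables.rules; then
          iptables -F
          iptables -P INPUT DROP
          iptables -P FORWARD DROP
          iptables -P OUTPUT ACCEPT
          iptables -A INPUT -i lo -j ACCEPT
          iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
          iptables -A INPUT -p tcp --dport 22 -j ACCEPT
          logger "iptables-restore failed; loaded SSH-only fallback rules"
      fi

    Keeping the other services from starting would still need handling elsewhere (e.g. in their init scripts), but at least the box stays reachable over SSH and closed to everything else.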


  • IP6tables blocks INPUT? can't connect with youtube API

    - by klaas
    I thought I had a simple IPv6 firewall, but it turned out to be hell. Somehow I really can't connect to any IPv6 host from my machine unless I set the INPUT policy to ACCEPT. Below is my current ip6tables configuration:

      ip6tables -L

      Chain INPUT (policy DROP)
      target     prot       opt  source     destination
      ACCEPT     all             anywhere   anywhere      state RELATED,ESTABLISHED
      ACCEPT     ipv6-icmp       anywhere   anywhere
      ACCEPT     tcp             anywhere   anywhere      tcp dpt:http
      ACCEPT     tcp             anywhere   anywhere      tcp dpt:https

      Chain FORWARD (policy ACCEPT)
      target     prot       opt  source     destination

      Chain OUTPUT (policy ACCEPT)
      target     prot       opt  source     destination

    If I try to connect to any IPv6 address, it doesn't work:

      telnet gdata.youtube.com 80
      Trying 2a00:1450:4013:c00::76...

    or

      telnet gdata.youtube.com 443
      Trying 2a00:1450:4013:c00::76...

    When I set:

      ip6tables -P INPUT ACCEPT

    it works... but then, well, then everything is open? What is going on? Help?
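
    Two things commonly cause exactly this picture and are cheap to check (a sketch, not a diagnosis): there is no rule accepting loopback traffic, and the RELATED,ESTABLISHED rule only helps if IPv6 connection tracking is actually loaded; without it, reply packets for outgoing connections never match the state rule and fall through to the DROP policy.

      ip6tables -I INPUT 1 -i lo -j ACCEPT    # accept loopback explicitly
      lsmod | grep conntrack                  # is connection tracking loaded at all?
      modprobe nf_conntrack_ipv6              # module name on older kernels -- an assumption about this system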


  • Setting up Pure-FTPd with admin/user permissions for same directory

    - by modulaaron
    I need to set up 2 Pure-FTPd accounts - ftpuser and ftpadmin. Both will have access to a directory that contains 2 subdirectories - upload and download. The permissions criteria need to be as follows:

      - ftpuser can upload to /upload but cannot view the contents (blind drop).
      - ftpuser can download from /download but cannot write to it.
      - ftpadmin has full read/write permissions to both, including file deletion.

    Currently, the first two are not a problem - disabling /upload read access and /download write access for ftpuser did the job. The problem is that when a file is uploaded by ftpuser, its permissions are set to 644, meaning that user ftpadmin can only read it (note that all FTP directories are chown'd to ftpuser:ftpadmin). How can I give ftpadmin the power he so rightfully deserves?
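
    One knob worth looking at is the server umask, so uploads arrive group-writable for the ftpuser:ftpadmin group instead of 644. A sketch (the config-file location is distro-specific and an assumption; the first value applies to files, the second to directories):

      # put "117:007" in the Umask config file (e.g. /etc/pure-ftpd/conf/Umask on Debian-style layouts),
      # or pass it on the command line:
      pure-ftpd -U 117:007    # files are created 660, directories 770

    Whether ftpadmin can also delete still depends on the directories themselves being group-writable, which 770 on /upload and /download would cover.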


  • Protect Section in Word without limiting formatting in unprotected sections

    - by grom
    Steps to create a protected section (in Word 2003):

      1. Insert - Break...
      2. Choose Section break, Continuous
      3. Tools - Protect Document...
      4. Enable 'Allow only this type of editing in the document' in editing restrictions
      5. In the drop down select 'Filling in forms'
      6. Click on 'Select sections...' and uncheck the unprotected sections (eg. Section 2)
      7. Click 'Yes, Start Enforcing Protection' and optionally set a password.

    Now go to the unprotected section: in the Format menu, options like 'Bullets and Numbering...' and 'Borders and Shading...' are greyed out. How can you protect a section without limiting the features that can be used in the unprotected section?


  • Archiving to Tape

    - by Bruno
    This is not about backups, this is about archiving. For argument's sake, let's say I have a 2TB 7z file that I would like to archive to tape. I have 4 LTO-5 tapes (1.5TB each). This may be a stupid question, but what setup would I need that would allow me to drag and drop those files directly onto the tapes and would automatically split the file across 2 tapes, like so:

      Tape 1: Copy 1, 1.5TB
      Tape 2: Copy 1, 0.5TB
      Tape 3: Copy 2, 1.5TB
      Tape 4: Copy 2, 0.5TB

    I just want to be able to specify which files go on which tapes, as opposed to backups where the tapes just rotate. Thanks.
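
    Drag-and-drop straight onto tape generally needs something like LTFS; without that, GNU tar's multi-volume mode will do the splitting across tapes. A sketch (the device name and the exact tape length are assumptions; --tape-length is in units of 1024 bytes, so the value below is a bit under 1.5TB):

      tar --create --multi-volume --tape-length=1400000000 \
          --file=/dev/nst0 archive.7z
      # tar pauses and asks for the next tape when the first one fills up;
      # run it again with fresh tapes for the second copy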


  • Copy-paste stops working on Windows 7

    - by earlyadopter
    Copy-paste functionality stops working after about an hour after each reboot on a Windows 7 64-bit system. Running Google Chrome (with gmail and few other tabs open like Calendar, Reader), MS Outlook (which I don't think has anything to do with the problem — I saw it when outlook was off as well), iTunes (9.1.1.12 if it matters). Would appreciate hints where and what for to look in a registry, and ideas for possible fix. That is not an Internet Explorer problem (I don't even run it) — it happens in all applications. Neither Ctrl-C/Ctrl-V nor context menu right-click Copy-Paste (actually, nothing happens on Copy, so there is nothing in a clipboard to Paste) are working. Drag-and-Drop (where supported) continues working though.


  • SQL Server "Long running transaction" performance counter: why no workee?

    - by Sleepless
    Please explain to me the following observation: I have the following piece of T-SQL code that I run from SSMS:

      BEGIN TRAN

      SELECT COUNT (*) FROM m
      WHERE m.[x] = 123456
         or m.[y] IN (SELECT f.x FROM f)

      SELECT COUNT (*) FROM m
      WHERE m.[x] = 123456
         or m.[y] IN (SELECT f.x FROM f)

      COMMIT TRAN

    The query takes about twenty seconds to run. I have no other user queries running on the server. Under these circumstances, I would expect the performance counter "MSSQL$SQLInstanceName:Transactions\Longest Transaction Running Time" to rise steadily up to a value of 20 and then drop rapidly. Instead, it rises to around 12 within two seconds and then oscillates between 12 and 14 for the duration of the query, after which it drops again. According to the MS docs, the counter measures "The length of time (in seconds) since the start of the transaction that has been active longer than any other current transaction." But apparently, it doesn't. What gives?


  • Bash - read as a fallback to $@

    - by user137369
    I have a working bash script (on OS X) that takes files and directories as input and does something like:

      for inputFile in $@
      do
        [someStuff]
      done

    but I want to provide a "fallback", meaning, if the script is started with no arguments (double-clicked, for example), it can take input at that time by letting the user drop the files directly on the terminal (possibly through read, but not mandatory; I'm open to better/different solutions). I'm guessing I should use some kind of if statement, but I'm not sure how. I'd like to not essentially double the script's size by repeating [someStuff] for each case. Thank you.
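
    A minimal sketch of that if statement (assumption: paths dropped on the terminal arrive space-separated without embedded spaces, which is what read -a can split; [someStuff] stays written once):

      #!/bin/bash
      if [ $# -gt 0 ]; then
          files=("$@")                                                  # normal case: arguments given
      else
          read -e -p "Drop files here, then press Enter: " -a files    # fallback: prompt for them
      fi

      for inputFile in "${files[@]}"; do
          # [someStuff] goes here, written only once
          printf 'processing %s\n' "$inputFile"
      done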


  • Copying files SSH vs sFTP

    - by jackquack
    I'm a bit of a Unix noob, but this question seems super basic, yet I can't find an answer anywhere. Basically, to my knowledge, SFTP is just FTP over SSH. So why can't I drag and drop files from one folder to another on the server side like I can over SSH? Why, when I want to unzip a .tar in a server folder, does it first want to copy it to my machine and then back? Why can't it just unzip like it can when I'm using the command line? I know that when I use the command line it is using the resources of the remote machine, but why can't SFTP do that too? Is there a way to execute commands which I would normally do over SSH, but in a GUI? I've tried mapping the drive to my own machine, and I've tried so many SFTP clients that it's silly. Is there another class of program that I just don't know of?


  • What's the largest message size modern email systems generally support?

    - by Phil Hollenback
    I know that Yahoo and Google mail support 25MB email attachments. I have an idea from somewhere that 10MB email messages are generally supported by modern email systems. So if I'm sending an email between two arbitrary users on the internet, what's the safe upper bound on message size? 1MB? 10MB? 25MB? I know that one answer is 'don't send big emails, use some sort of drop box'. I'm looking for a guideline if you are limited to only using regular smtp email.


  • Windows 7 won't load unless other harddrives "disconnect"ed in UEFI shell

    - by lmz
    I have three disks: one GPT-partitioned, containing Windows 7 and Debian; a second MBR-partitioned, containing CentOS; and a third MBR-partitioned, empty. It used to work (loading the Windows boot manager using rEFIt), but now, after installing CentOS and OpenIndiana on the second drive, Windows won't boot. The logo is displayed briefly, then a text-mode scrollbar "Loading files", then back to the rEFIt menu. The only thing that makes it work is if I drop into the UEFI shell and run disconnect XX, where XX is the device handle of the other hard drives (obtained from running devices). This makes me think the bootloader is getting confused about where the Windows partition is. Is there any information on how the Windows UEFI boot loader finds the Windows partition, or is there any logging I can turn on to help troubleshoot this issue?
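
    For the "how does it find the partition" part, one hedged starting point: the Windows Boot Manager reads its target partition from the BCD store, which you can dump from a recovery command prompt (Shift+F10 from the install media) to see which device entries {bootmgr} and {default} point at:

      bcdedit /enum all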


  • DSL connection dropping when starting game

    - by bootoo
    New connection set up last week. I have called tech support, the signal is fine, and they are sending out a tech guy tomorrow; however, I can't make sense of this. Using Vista, a 2Wire modem and a 6Mb ATT connection, wireless, I can browse websites, stream video from my PC wirelessly to my Xbox, and watch Hulu in high res; the connection seems fine. Then I load up World of Warcraft, I select Enter World, and without fail my 2Wire modem's 'DSL' light goes red, my internet light goes out, and I'm no longer connected. After 10 or 15 seconds it works again, but the game has kicked me. I noticed the same thing on opening or closing uTorrent as well. There are no other phones/devices plugged into the phone jacks in the apartment. Why would joining a game cause my DSL to drop? I can only guess it throws a bunch of information at the server when I hit 'Enter World' and that causes an issue that shuts my connection down, or my router hates me. All help appreciated.


  • Installing Windows XP sp3 into USB 2.0 WD 320Gb Hard Drive

    - by NetKabuki
    I have an HP laptop with 3 USB 2.0 ports. I also have a clean 320GB WD USB 2.0 drive. The HP can boot from USB (BIOS option). I used the install disk (XP SP3, bootable) and, after a few stutters, was actually able to load up a partition on the 320GB drive with Windows. However, I cannot consistently get the system to boot up off the USB drive. I am able to drop Ubuntu on that same WD drive and boot up. I can even get Grub2 on Ubuntu to recognize the Windows OS. But booting the Windows OS is an impossible task. What can I do differently, if anything?


  • Pure-FTPd linux upload problems - 500 Unknown command

    - by user1801273
    I'm running CentOS with the Kloxo control panel and Pure-FTPd on a VPS based on HyperVM. The problem: during the day there are periods when FTP works fine for hours, but sometimes it dies with a "500 Unknown command" code for files larger than 2KB. This is really weird - with files of less than 2472-2484 bytes everything works fine, but with bigger files the transfer starts writing the file, stalls at 2KB and returns "500 Unknown command". This happens on passive and/or active connections, for any file type. The logs show the file as uploaded successfully, but the speed reported next to the action is 0.3-0.5 KB/s, when normal operation reports speeds of 70-150 KB/s. I do have some iptables firewalls on both the HyperVM host and the VPS, but these are static, and most of the time it works just fine. Any suggestions on where to look for a solution would be appreciated :)


  • Adding Windows 7 to grub4dos menu.lst

    - by antonio
    I am trying to create a multiboot USB drive with grub4dos. I started with a working bootable WinPE-like USB drive, based on Windows 7. I modified the drive's MBR with grubinst.exe (hd1), and copied grldr and the menu.lst file into its root:

      color blue/green yellow/red white/magenta white/magenta
      timeout 30
      default 0

      title Win 7 test
      rootnoverify (hd0, 0)
      chainloader /bootmgr

    I get the error:

      Try (hd0, 0). This partition is ntfs but with unknown boot record
      Try (hd0, 1) ...
      ...
      Cannot find GRLDR.

    If I hit a key, it boots Windows 7 anyway. I would like to drop to the GRUB command shell, but when I hit "c" GRUB boots into Windows.
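
    A variant of that entry that is often more forgiving in grub4dos, since it searches all partitions for bootmgr instead of hard-coding (hd0,0) (a sketch; it assumes bootmgr sits in the root of the Windows partition):

      title Win 7 (search for bootmgr)
      find --set-root /bootmgr
      chainloader /bootmgr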


  • Where in the user profile are the Firefox search engine choices stored?

    - by N Rahl
    We have a large number of user profiles that were created on Ubuntu 10.04; they had access to Google as a choice in the search bar, and Google was the provider for queries typed into the super bar. When logging into these same profiles from Mint 15 client machines, the Google search option does not exist for these users, as is the default for Mint. This setting seems to be user-specific, but not part of the Firefox profile? It seems that if it were part of the FF profile, it would "just work" on Mint for these profiles, so I suspect the configuration may be stored somewhere else in the user's profile. Could someone please tell me where in a user's profile the search engine options are set? We would like to set this once, and then drop this configuration into everyone's profile so all of our users don't have to do it manually.


  • Where to learn how to replicate an Excel template?

    - by Rosarch
    This Excel template is really cool. There are a lot of things in it I don't know how to do, such as:

      - Having header rows that "stick" to the top even when you scroll down
      - Slider on the first page changes where the chart pulls its data from
      - Functions seem to be referring to named ranges in tables, like =SUM([nov]). Where do those names come from?
      - Clicking "back to overview" on the "Budget" page returns you to the "Dashboard" page
      - The number under "starting balance" in the top right corner of "Budget" changes when you change cell C5
      - On "Budget", each cell in the first column of each table has a drop-down menu for text, which seems to come from the "Setup" page
      - The background isn't just plain white, but when I try to format paint it onto a new sheet, nothing happens

    If you know how any of these effects are achieved, I'm definitely curious. But I guess the main point of my question is where I can go to answer these questions for myself. Are templates explained anywhere?


  • Downloads on Vista Home Premium start off fast but slow down to 0 Kb/s and hang

    - by user66265
    I have Windows Vista Home Premium on my computer, and every time I go to download something, it starts out at about 1.5 Mb/s and stays there for about 3 seconds, then slows down to 800 Kb/s and continues to drop until it gets down to 0 Kb/s and hangs. I've tried just about everything I can find, such as uninstalling all firewalls/antivirus, doing the netsh rss, autotune, and chimney disable, and updating everything, but it still continues to happen. I'd prefer not to reinstall, but if I have to then I have to... EDIT: Figured it out, the router needed a firmware update.


  • DB2 UDF Permissions

    - by WernerCD
    I have a custom function that I'm working on... the problem I'm having is simple: permissions. Example function:

      drop function circle_area
      go

      CREATE FUNCTION circle_area (radius FLOAT)
      RETURNS FLOAT
      LANGUAGE SQL
      BEGIN
        DECLARE pi FLOAT DEFAULT 3.14;
        DECLARE area FLOAT;
        SET area = pi * radius * radius;
        RETURN area;
      END
      go

    If I then log out of my "admin" account and log into a test account, I get a "Not authorized" error when I try to run something like "Select circle_area(foo) from library.bar". I can log into iSeries Navigator, navigate to the schema, Functions, Permissions, and change the permission for PUBLIC from Exclude to All; bam, it works. How do I grant permission to all, either in the CREATE FUNCTION or afterwards?
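
    For the "afterwards" case, the usual shape is an EXECUTE grant on the function; a sketch (the lack of a schema qualifier and the choice of PUBLIC rather than a specific group profile are assumptions):

      -- add the schema qualifier if the function doesn't live in the default schema
      GRANT EXECUTE ON FUNCTION circle_area TO PUBLIC;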

