Search Results

Search found 21947 results on 878 pages for 'static import'.

Page 68 of 878

  • Import Twitter XML into Visual Basic

    - by ct2k7
    Hello, I'm trying to import the XML provided by Twitter into a readable format in Visual Basic. The XML looks like this:

        <?xml version="1.0" encoding="UTF-8" ?>
        <statuses type="array">
          <status>
            <created_at>Mon Jan 18 20:41:19 +0000 2010</created_at>
            <id>111111111</id>
            <text>thattext</text>
            <source><a href="http://www.seesmic.com/" rel="nofollow">Seesmic</a></source>
            <truncated>false</truncated>
            <in_reply_to_status_id>7916479948</in_reply_to_status_id>
            <in_reply_to_user_id>90978206</in_reply_to_user_id>
            <favorited>false</favorited>
            <in_reply_to_screen_name>radonsystems</in_reply_to_screen_name>
            <user>
              <id>20193170</id>
              <name>personname</name>
              <screen_name>screenname</screen_name>
              <location>loc</location>
              <description>desc</description>
              <profile_image_url>http://a3.twimg.com/profile_images/747012343/twitter_normal.png</profile_image_url>
              <url>myurl</url>
              <protected>false</protected>
              <followers_count>97</followers_count>
              <profile_background_color>ffffff</profile_background_color>
              <profile_text_color>333333</profile_text_color>
              <profile_link_color>0084B4</profile_link_color>
              <profile_sidebar_fill_color>ffffff</profile_sidebar_fill_color>
              <profile_sidebar_border_color>ababab</profile_sidebar_border_color>
              <friends_count>76</friends_count>
              <created_at>Thu Feb 05 21:54:24 +0000 2009</created_at>
              <favourites_count>1</favourites_count>
              <utc_offset>0</utc_offset>
              <time_zone>London</time_zone>
              <profile_background_image_url>http://a3.twimg.com/profile_background_images/76723999/754686.png</profile_background_image_url>
              <profile_background_tile>true</profile_background_tile>
              <notifications>false</notifications>
              <geo_enabled>true</geo_enabled>
              <verified>false</verified>
              <following>false</following>
              <statuses_count>782</statuses_count>
              <lang>en</lang>
              <contributors_enabled>false</contributors_enabled>
            </user>
            <geo />
            <coordinates />
            <place />
            <contributors />
          </status>
        </statuses>

    Now, I want to display it in a panel that automatically refreshes after a certain period; however, I only want to pick out certain bits of information from that XML, such as profile_image_url, text and created_at. You can guess how the data will be formatted: much like what TweetDeck and other Twitter clients present. I'm quite new to Visual Basic, so how could I do this? Thanks

    Read the article

  • using getScript to import plugin on page using multiple versions of jQuery

    - by mikez302
    I am developing an app on a page that uses jQuery 1.2.6, but I would like to use jQuery 1.4.2 for my app. I really don't like using multiple versions of jQuery like this, but the copy on the page (1.2.6) is something I have no control over. I decided to isolate my code like this:

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
        <html><head>
        <script type="text/javascript" src="jquery-1.2.6.min.js"></script>
        <script type="text/javascript" src="pageStuff.js"></script>
        </head>
        <body>
        Welcome to our page.
        <div id="app">
        <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.js"></script>
        <script type="text/javascript" src="myStuff.js"></script>
        </div>
        </body></html>

    The file myStuff.js has my own code that is supposed to use jQuery 1.4.2, and it looks like this:

        (function($) { // wrap everything in a function to keep the ability to use the $ var with noConflict
            var jQuery = $;
            // my code
        })(jQuery.noConflict(true));

    This is an extremely simplified version, but I hope you get the idea of what I did. For a while, everything worked fine. However, I decided I wanted to use a jQuery plugin in a separate file. I tested it and it acted funny. After some experimentation, I found out that the plugin was using the old version of jQuery, when I wanted it to use the new version. Does anyone know how to import and run a JS file from the context within the function wrapping the code in myStuff.js?

    In case this matters to anyone, here is how I know the plugin is using the old version, and what I did to try to solve the problem. I made a file called test.js, consisting of this line:

        alert($.fn.jquery);

    I tried referencing the file in a script tag the way external JavaScript is usually included, below myStuff.js, and it came up as 1.2.6, like I expected. I then got rid of that script tag and put this line in myStuff.js:

        $.getScript("test.js");

    and it still came back as 1.2.6. That wasn't a big surprise: according to jQuery's documentation, scripts included that way are executed in the global context. I then tried doing this instead:

        var testFn = $.proxy($.getScript, this);
        testFn("test.js");

    and it still came back as 1.2.6. After some tinkering, I found out that the "this" keyword referred to the window, which I assume means the global context. I am looking for something to put in place of "this" to refer to the context of the enclosing function, or some other way to make the code in the file run from the enclosing function. I noticed that if I copy and paste the code, it works fine, but it is a big plugin that is used in many places, and I would prefer not to clutter up my file with their code. I am out of ideas. Does anyone else know how to do this?

    Read the article

  • How to export ECC key and Cert from NSS DB and import into JKS keystore and Oracle Wallet

    - by mv
    In this blog I will write about how to extract a cert and key from an NSS DB, import them into a JKS keystore, and then import that JKS keystore into an Oracle Wallet.

    1. Set JAVA_HOME

    I pointed it to JRE 1.6.0_22:

        $ export JAVA_HOME=/usr/java/jre1.6.0_22/

    2. Create a self-signed ECC cert in an NSS DB

    I created an NSS DB with a self-signed ECC certificate. If you already have an NSS DB with an ECC cert (and key), skip this step.

        $ export NSS_DIR=/export/home/nss/
        $ $NSS_DIR/certutil -N -d .
        $ $NSS_DIR/certutil -S -x -s "CN=test,C=US" -t "C,C,C" -n ecc-cert -k ec -q nistp192 -d .

    3. Export the ECC cert and key using pk12util

    Use the NSS tool pk12util to export this cert and key into a p12 file:

        $ $NSS_DIR/pk12util -o ecc-cert.p12 -n ecc-cert -d . -W password

    4. Use keytool to create a JKS keystore and import this p12 file

    4.1 Import the p12 file created above into a JKS keystore:

        $JAVA_HOME/bin/keytool -importkeystore -srckeystore ecc-cert.p12 -srcstoretype PKCS12 -deststoretype JKS -destkeystore ecc.jks -srcstorepass password -deststorepass password -srcalias ecc-cert -destalias ecc-cert -srckeypass password -destkeypass password -v

    If, however, an error like the following is encountered, continue with step 4.2:

        keytool error: java.security.UnrecoverableKeyException: Get Key failed: EC KeyFactory not available
        java.security.UnrecoverableKeyException: Get Key failed: EC KeyFactory not available
                at com.sun.net.ssl.internal.pkcs12.PKCS12KeyStore.engineGetKey(Unknown Source)
                at java.security.KeyStoreSpi.engineGetEntry(Unknown Source)
                at java.security.KeyStore.getEntry(Unknown Source)
                at sun.security.tools.KeyTool.recoverEntry(Unknown Source)
                at sun.security.tools.KeyTool.doImportKeyStoreSingle(Unknown Source)
                at sun.security.tools.KeyTool.doImportKeyStore(Unknown Source)
                at sun.security.tools.KeyTool.doCommands(Unknown Source)
                at sun.security.tools.KeyTool.run(Unknown Source)
                at sun.security.tools.KeyTool.main(Unknown Source)
        Caused by: java.security.NoSuchAlgorithmException: EC KeyFactory not available
                at java.security.KeyFactory.<init>(Unknown Source)
                at java.security.KeyFactory.getInstance(Unknown Source)
                ... 9 more

    4.2 Create a new PKCS11 provider

    If you didn't get an error as shown above, skip this step. Since we already have NSS libraries built with ECC, we can create a new PKCS11 provider.

    Create ${java.home}/jre/lib/security/nss.cfg as follows:

        name = NSS
        nssLibraryDirectory = ${nsslibdir}
        nssDbMode = noDb
        attributes = compatibility

    where nsslibdir should contain NSS libs with ECC support. Add the following line to ${java.home}/jre/lib/security/java.security:

        security.provider.9=sun.security.pkcs11.SunPKCS11 ${java.home}/lib/security/nss.cfg

    Note that for those who are using Oracle iPlanet Web Server or Oracle Traffic Director, NSS libs built with ECC are in <ws_install_dir>/lib or <otd_install_dir>/lib.

    4.3 Now keytool should work

    Now you can try the same keytool command and see that it succeeds:

        $JAVA_HOME/bin/keytool -importkeystore -srckeystore ecc-cert.p12 -srcstoretype PKCS12 -deststoretype JKS -destkeystore ecc.jks -srcstorepass password -deststorepass password -srcalias ecc-cert -destalias ecc-cert -srckeypass password -destkeypass password -v
        [Storing ecc.jks]

    5. Convert the JKS keystore into an Oracle Wallet

    If you need to, you can export this cert and key from the JKS keystore and import them into an Oracle Wallet using the orapki tool as shown below. Make sure that the orapki you use supports ECC. Also, for ECC you MUST use the "-jsafe" option.

        $ orapki wallet create -pwd password -wallet . -jsafe
        $ orapki wallet jks_to_pkcs12 -wallet . -pwd password -keystore ecc.jks -jkspwd password -jsafe
        $ orapki wallet display -wallet . -pwd welcome1 -jsafe
        Oracle PKI Tool : Version 11.1.2.0.0
        Copyright (c) 2004, 2012, Oracle and/or its affiliates. All rights reserved.
        Requested Certificates:
        User Certificates:
        Subject:        CN=test,C=US
        Trusted Certificates:
        Subject:        OU=Class 3 Public Primary Certification Authority,O=VeriSign\, Inc.,C=US
        Subject:        CN=GTE CyberTrust Global Root,OU=GTE CyberTrust Solutions\, Inc.,O=GTE Corporation,C=US
        Subject:        OU=Class 2 Public Primary Certification Authority,O=VeriSign\, Inc.,C=US
        Subject:        OU=Class 1 Public Primary Certification Authority,O=VeriSign\, Inc.,C=US
        Subject:        CN=test,C=US

    As you can see, our ECC cert is in the wallet. You can follow the same steps for RSA certs as well.

    6. References

    http://icedtea.classpath.org/bugzilla/show_bug.cgi?id=356
    http://old.nabble.com/-PATCH-FOR-REVIEW-%3A-Support-PKCS11-cryptography-via-NSS-p25282932.html
    http://www.mozilla.org/projects/security/pki/nss/tools/pk12util.html

    Read the article

  • How to import in BIDS more than one SSIS package in one shot!

    - by Luca Zavarella
    Have you ever wanted to add more than one existing Integration Services package (e.g. 20 packages) to an SSIS project? Well, you may suppose that the Open dialog supports multi-file selection so you can import more than one file at a time... but the BIDS Open dialog doesn't allow this; you can only select a single file! Hence the loss of valuable time spent importing the packages one at a time. A few days ago I learned a trick that solves the problem, thanks to this post by Matt Masson. Just copy all the packages to import from Windows Explorer (Ctrl + C), then right-click on the SSIS Packages folder of the Integration Services project and do a simple Paste (Ctrl + V). So "auto-magically" you'll have all those packages imported into your Integration Services project! What can I say... this feature was well hidden!

    Read the article

  • Moving a .aspx site over to nginx, need to serve .aspx as static files or rewrite

    - by Blankman
    I have a simple CMS site that was written in ASP.NET. My site looks like:

        www.example.com/
        www.example.com/content/index.aspx
        www.example.com/content/get.aspx?id=234   (loads an article)

    It currently uses a database, but I am going to dump all the content to files, and then I can just pull the contents of a file based on the id=234 value. I want to move this site over to my Ubuntu nginx server. What options do I have? Suggestions? I want to keep the URL structure exactly as it is now; how can I do this? Would this be any easier using Apache?
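
    A sketch of one possible nginx setup, assuming the dumped articles end up as /var/www/example/articles/<id>.html; every path and name below is illustrative rather than taken from the question:

        # hypothetical /etc/nginx/sites-available/example.com
        server {
            listen 80;
            server_name www.example.com example.com;
            root /var/www/example;

            # keep the old .aspx URLs working against the dumped files
            location = /content/index.aspx {
                try_files /index.html =404;
            }
            location = /content/get.aspx {
                # /content/get.aspx?id=234  ->  /var/www/example/articles/234.html
                try_files /articles/$arg_id.html =404;
            }

            location / {
                try_files $uri $uri/ =404;
            }
        }

    After symlinking the file into sites-enabled, "nginx -t && service nginx reload" picks it up. The same mapping could be done in Apache with mod_rewrite, but once the content is plain static files there is little reason to prefer it here.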

    Read the article

  • NLB NIC w/ Static IP permanently "Acquiring Network Address" after adding more IPs to the cluster

    - by I.T. Support
    We have a 2-node NLB cluster. I recently added more IPs to the cluster range, after which the NLB NICs on both of my nodes ended up in a permanent "acquiring network address" state. The cluster appears to be functioning despite this hang in acquiring an address. We've also rebooted the machines, and these NICs remain in the "acquiring network address" state even after reboot, which I find troubling. Questions:

    Does anyone know where I can find documentation on how NLB binds to a NIC? (Are there DLLs involved, drivers, etc.?)
    Does anyone know of any programs that have known incompatibility issues with NLB? We have an application server installed on these NLB nodes (AOLserver) that interacts with the NICs as well, and I suspect that it might be causing this issue.

    Before I go directly to Microsoft, I want to gather as much information as possible about the issue. Any help / guidance would be much appreciated.

    Read the article

  • Multiple public IPs through DD-WRT without 1-to-1 NAT

    - by Stephen Touset
    I've done a search here and wasn't able to find anything relevant to my situation. I apologize in advance if I've missed an existing post on the topic. Our ISP has provided us with 6 static IP addresses. We are currently using two of them (plus one for the Comcast-provided router). One of the static addresses routes to our internal network, and the other goes to our VOIP phone system. Unfortunately, the Comcast machine doesn't support QoS, so our VOIP calls have been choppy. We plan to put the Comcast-provided router into bridge mode and replace it with an ASUS RT-N16 running DD-WRT. However, I'm unsure how to set up DD-WRT to function similarly to our existing Comcast router. The Comcast router's WAN IP is the first of our static IP addresses. We did not need to provide an internal LAN IP address — simply connecting machines that use our other public addresses to the LAN ports on the Comcast router is enough for it to route between the connected machines and our internet connection. Is there a way to do a similar setup through the DD-WRT? Thanks in advance.

    Read the article

  • Adding Static IPs to the NIC

    - by Brett Powell
    We are currently migrating a lot of new machines to our network, and my job this morning was to set up all of the IP addresses. I worked on this all morning, and when I got back tonight I was informed that they had all been set up incorrectly and had to be removed and re-added. I am quite confused, as I have been setting up IPs on machines for a long time, and I am curious as to what the issue is. Just take this example:

        72.26.196.160/29  255.255.255.248

    A /29 block is 5 usable IPs. With the script I wrote and used, the IP addresses .162 - .166 were added to the NIC. I can't remember now what the name for .161 was, but isn't it the broadcast address or something that isn't assigned to the NIC when adding additional IP blocks? I am curious as to where my logic is failing me. Not to mention that even if .161 was to be added, there is no reason why all of the IPs would have to be removed, as .161 could just be added in addition to these.
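
    For reference, a quick way to see exactly what this /29 contains; the snippet below is just an illustration and assumes python3 is available on the machine:

        python3 -c "
        import ipaddress
        net = ipaddress.ip_network('72.26.196.160/29')
        print('network  :', net.network_address)    # 72.26.196.160
        print('broadcast:', net.broadcast_address)  # 72.26.196.167
        print('usable   :', [str(h) for h in net.hosts()])  # .161 through .166
        "

    In this block .160 is the network address and .167 the broadcast, so .161 - .166 are the six assignable addresses; typically one of them (often .161) is the provider's gateway, which is what leaves five for the server itself.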

    Read the article

  • Cisco ASA and static IPv6 tunnel endpoint?

    - by Martijn Heemels
    I recently installed a Cisco ASA 5505 firewall on the edge of our LAN. The setup is simple: Internet <-- ASA <-- LAN. I would like to provide the hosts in the LAN with IPv6 connectivity by setting up a 6in4 tunnel to SixXS. It would be nice to have the ASA as the tunnel endpoint so it can firewall both IPv4 and IPv6 traffic. Unfortunately the ASA apparently can't create a tunnel itself, and can't port-forward protocol 41 traffic, so I believe I would have to do one of the following instead:

    1. Set up a host with its own IP outside the firewall, and have that function as the tunnel endpoint. The ASA can then firewall and route the v6 subnet to the LAN.
    2. Set up a host inside the firewall that functions as the endpoint, separated via a VLAN or whatever, and loop the traffic back into the ASA where it can be firewalled and routed. This seems contrived, but would allow me to use a VM instead of a physical machine as the endpoint.

    Any other way? What would you suggest is the optimal way to set this up? P.S. I do have a spare public IP address available if needed, and can spin up another VM in our VMware infrastructure.
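
    For either option, the endpoint host itself is just a Linux box terminating a static 6in4 (protocol 41) tunnel; a generic sketch is shown below. All addresses are placeholders rather than real SixXS values, and a SixXS heartbeat tunnel would need aiccu instead of these manual commands:

        ip tunnel add sixxs mode sit remote 203.0.113.1 local 198.51.100.2 ttl 64
        ip link set sixxs up
        ip addr add 2001:db8:100::2/64 dev sixxs        # tunnel client endpoint
        ip route add ::/0 dev sixxs

        # then put the delegated subnet (e.g. 2001:db8:200::/64) on the LAN side
        ip addr add 2001:db8:200::1/64 dev eth1
        sysctl -w net.ipv6.conf.all.forwarding=1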

    Read the article

  • Speaker static event fires with every action

    - by P.Brian.Mackey
    When I...

    - click my mouse (every time)
    - stroke a key on the keyboard (sometimes)
    - watch a video (horrible)
    - take the hard drive out for a walk (bad)

    ...the Cyber Acoustics CA-2014rb speakers chime in with an onomatopoeia. It sounds like radio interference: a click for every platter revolution, a chime for every 80 KB of data. This is a virus-type Pokémon immune to formats, OS upgrades and everything but volume control, power outages and deep submersion. How can I defeat the monster rattling in my PC?

    Read the article

  • I love Google Chrome, but some non-static pages like Piwik render it unresponsive

    - by gogowitsch
    The web-stat software Piwik stops reacting to mouse clicks after 1-2 seconds. The same is true for Google Maps and Producteev (but Gmail and most other pages work like a charm). These rely heavily on JS and work without Flash. I can click for a very short time period and then the mouse cursor doesn't feel the UI anymore (it doesn't turn into an I-beam over input fields, though it moves; if the freeze occurred while the pointer was over an input field, the cursor keeps being an I-beam) and all clicks on the DOM are ignored by Chrome. No message appears, neither an obvious one nor in the Console (F12). There is no obstructing div or the like in the DOM (F12). Since I couldn't find any hints on the source of my problems, I suspected my plugins and extensions. Unfortunately, neither deactivating all plugins nor all extensions solved the problem.

    - For the problematic pages, it always happens.
    - No Dropbox running.
    - Several GB of free RAM.
    - The task manager doesn't show any high CPU or memory utilization (the offending tab uses 30 MB and 0-1% CPU).
    - All problematic pages work in other browsers (Chrome, Firefox, IE).
    - The rest of the computer is very responsive.
    - The computers use different security suites (Kaspersky and Avira).

    The effect exists between several (synchronized) Chrome instances on different machines, all running Windows 7. Both the OS and Chrome are updated automatically. Other tabs and the Chrome chrome (tabs, menus, toolbar buttons of the browser itself) still work. I really don't like switching between browsers. Any ideas?

    Read the article

  • IPv6 static routes

    - by user98651
    I am looking to configure a few hosts with IPv6 on my network. The router (running CentOS 5) is configured with a Hurricane Electric (HE) tunnel, which works fine on that host. However, I would like to statically give a few additional hosts on the same LAN IPv6 connectivity through this tunnel. No, I don't want radvd or dhcpv6 to do the work for me in this case. I already have IPv6 forwarding enabled in sysctl.conf. I am looking for help with the next steps (statically adding the addresses and routes). Let's say the IP addresses are as follows:

        Router: 2001:470:1b07:1::
        Host1:  2001:470:1b07:2::

    How would I go about making them see each other? Thanks in advance for the help.
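
    A minimal static sketch, assuming the routed /64 from the tunnel is 2001:470:1b07:1::/64, that eth1 is the router's LAN-facing interface and eth0 the host's; the real prefix and interface names have to come from the HE tunnel details:

        # on the CentOS router
        ip -6 addr add 2001:470:1b07:1::1/64 dev eth1

        # on each LAN host
        ip -6 addr add 2001:470:1b07:1::10/64 dev eth0
        ip -6 route add default via 2001:470:1b07:1::1

        # quick test from the host
        ping6 2001:470:1b07:1::1

    Putting the hosts in the same /64 as the router's LAN address keeps everything on a single on-link prefix; with addresses from different /64s (as in the example above), each side would instead need an explicit route for the other prefix via the router.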

    Read the article

  • Multiple IPs for a server not reachable

    - by andrewk
    FYI: I've read everything on Server Fault related to this question and am facing a different issue. Simply put, I've got one server (apache2) with a couple of sites on it. It currently has one IP. I'm trying to assign/add another IP to that server, so I can give each site a different IP for SSL purposes. I am not lucking out: the new IP is simply unreachable; I've pinged it. This is what I've got below; what am I doing wrong?

        auto lo
        iface lo inet loopback

        auto eth0 eth0:0 eth0:1
        iface eth0 inet static
            address 70.116.5.244
            netmask 255.255.255.0
            gateway 70.116.5.1

        # THE NEW IP
        iface eth0:0 inet static
            address 26.175.217.102
            netmask 255.255.255.0

        # PRIVATE IP
        iface eth0:1 inet static
            address 192.168.158.88
            netmask 255.255.128.0

    NOTE: these IPs are tweaked but relative. I've read many questions here that are 90% similar to this, but in most of them the IP actually responds, which is not the case here. Thanks.

    netstat -r output:

        Kernel IP routing table
        Destination     Gateway            Genmask          Flags  MSS  Window  irtt  Iface
        default         gw-u6.linode.co    0.0.0.0          UG     0    0       0     eth0
        70.116.5.0      *                  255.255.255.0    U      0    0       0     eth0
        26.175.217.0    *                  255.255.255.0    U      0    0       0     eth0
        192.168.128.0   *                  255.255.128.0    U      0    0       0     eth0
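
    A first diagnostic step worth trying here (just a sketch; the address below is the question's placeholder) is to watch whether ARP requests and traffic for the new IP ever reach the box at all. If nothing arrives while someone pings it from outside, the problem is upstream, for example the extra address not actually being assigned or routed to this host by the provider:

        # on the server: watch for ARP and traffic aimed at the new address
        tcpdump -n -e -i eth0 arp or host 26.175.217.102

        # confirm the address is really configured locally, and that it can
        # source traffic toward the gateway
        ip addr show dev eth0
        ping -c 3 -I 26.175.217.102 70.116.5.1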

    Read the article

  • Importing photoimpact .ab3 file database into another album program

    - by Mel F
    I have PhotoImpact Album v5, which has now maxed out at 65,000 photos with database information. I can export the database and file information to a CSV or dBASE 4 file, but I cannot find any way to import the information into a competitive photo organization/database product like Adobe Lightroom, ACDSee Pro 3 or Photo Collector, etc. that is more modern, will work on Vista/Windows 7, and doesn't have any file limitations. Any help would be greatly appreciated. I really cannot re-input the data for 65,000 photos manually. I would use any good software that I can transfer my files and data info to.

    Read the article

  • Can not change to a static IP in Fedora 19

    - by user196272
    I'm having a bit of a weird situation. I've installed Fedora Linux 19 onto a virtual machine with no GUI. Initially, eth0 does not show up when I run ifconfig. When I run dmesg | grep eth, I see the adapter, but it says it changed names to p2p1. Once I run ifconfig p2p1 up, it shows up. Now, when I try to edit /etc/sysconfig/network-scripts/ifcfg-p2p1, it does not exist; the only scripts that are there are lo and enp0s3. If I create the ifcfg-p2p1 file with the correct settings, I cannot restart the network service. I tried editing the enp0s3 file, but that did not work. I'm fairly new to Linux and not sure what else to put in here, so if you need any more information just let me know and I'll put it in here.
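
    For what it's worth, a minimal static config of the kind Fedora's network scripts expect looks roughly like the sketch below. The device name p2p1 comes from the question, the addresses are placeholders, and on Fedora 19 the NIC may actually be managed by NetworkManager (hence the enp0s3 script), so the file name has to match whatever name the system is really using:

        # hypothetical /etc/sysconfig/network-scripts/ifcfg-p2p1
        DEVICE=p2p1
        ONBOOT=yes
        BOOTPROTO=none
        IPADDR=192.168.1.50
        PREFIX=24
        GATEWAY=192.168.1.1
        DNS1=192.168.1.1

    After saving it, restart networking with "systemctl restart network.service" or, if NetworkManager owns the interface, "systemctl restart NetworkManager.service".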

    Read the article

  • Update mysql database with arpwatch textfile database

    - by bVector
    I'm looking to keep arpwatch entries in a MySQL database to cross-reference with other information I'm storing based on MAC addresses. I've manually imported the arpwatch database into my MySQL database, but being a novice with databases I'm not sure of the best way to continually update the database with new entries without creating duplicates. None of the fields can be unique on its own, as even the time is duplicated frequently. I'm not interested in the actual arpwatch events like "flip flop" or "new station", just the mac/ip/time pairings. Would a simple bash (or SQL) shell script do the trick? Would it be possible to make the MAC address plus the time a composite key of some sort?

    The database is called utility, the table is arpwatch, and its columns are mac, ip, time. A separate table named 'hosts', with columns mac, ip, type, hostname, location, notes, has mac as the primary key; this table will correlate the different IP addresses that a MAC had over time, using the arpwatch table. The initial import was done with MySQL Workbench, using INSERT INTO commands with creative search-and-replace on the text file.
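
    One possible shape for this, sketched below, assumes arpwatch's text database lives at /var/lib/arpwatch/arp.dat with tab-separated mac/ip/time/hostname fields (the path and delimiter need to match the real file, and LOAD DATA LOCAL requires local_infile to be enabled). The idea is exactly the composite key from the question: make (mac, time) unique once, then let IGNORE skip rows that are already present so the import can be re-run from cron without creating duplicates:

        -- run once: reject duplicate (mac, time) pairs at the database level
        ALTER TABLE utility.arpwatch ADD UNIQUE KEY uniq_mac_time (mac, time);

        -- re-runnable import: IGNORE silently skips rows hitting the unique key;
        -- @hostname just discards the fourth field of arp.dat
        LOAD DATA LOCAL INFILE '/var/lib/arpwatch/arp.dat'
        IGNORE INTO TABLE utility.arpwatch
        FIELDS TERMINATED BY '\t'
        (mac, ip, time, @hostname);

    Saved as import.sql, this can be run periodically with something like "mysql -u root -p --local-infile=1 utility < import.sql".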

    Read the article

  • nginx: try_files not finding static files, falling back to PHP

    - by Wells Oliver
    Relevant configuration:

        location /myapp {
            root /home/me/myapp/www;
            try_files $uri $uri/ /myapp/index.php?url=$uri&$args;

            location ~ \.php {
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                include /etc/nginx/fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root/index.php;
            }
        }

    I absolutely have a file foo.html in /home/me/myapp/www, but when I browse to /myapp/foo.html it is handled by PHP, the final fallback in the try_files list. Why is this happening?

    Read the article

  • New Static Website with Hosted DNS alternating 502, 503 and Page Does Not Exist Errors

    - by Dave
    This has become an increasingly frustrating ordeal. I'm mostly a web developer, so forgive me if I am using improper terminology here. I have a client who had purchased a domain at JustHost. We built him a website and have it on our own server space. Now, I'm mostly used to dealing with GoDaddy, where it is simple enough to manage DNS records and point the A record to our server IP, and Apache on our end deals with the domains via name-based virtual hosts. But for some reason, in setting this up with JustHost, when attempting to go to the domain name I either get a 502 or 503 error or "webpage does not exist". Now, I know that the basic functionality of the webpage must be working, because I can access the index etc. straight through my server's www data (i.e. [server-ip]/website_folder). I was on the phone with JustHost technical support for over three hours yesterday, and the best I could get was "That's really weird..." I've checked my logs and there doesn't seem to be anything coming through to my end. Does anybody have an idea of what's going on here? I would love for it to be a problem on my end, because JustHost doesn't seem capable of helping further. Any help is greatly appreciated, thanks. I forgot to mention that we have several other sites up and running and completely accessible.
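
    Before blaming either end, it may help to confirm what DNS is actually handing out for the name; a quick sketch (the domain and nameserver names below are placeholders):

        # what do the public resolvers see right now?
        dig +short NS clientdomain.com
        dig +short A www.clientdomain.com

        # then ask one of the nameservers from the first answer directly and
        # compare the result with the web server's real IP
        dig +short A www.clientdomain.com @ns1.hostingdns.example

    If the A record points somewhere other than the expected server, or the authoritative and cached answers disagree, the 502/503 responses are coming from whatever machine the name currently resolves to rather than from the web server in question.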

    Read the article

  • Maximum execution time of 300 seconds exceeded error while importing large MySQL database

    - by Spacedust
    I'm trying to import a 641 MB MySQL database with this command:

        mysql -u root -p ddamiane_fakty < domenyin_damian_fakty.sql

    but I got an error:

        ERROR 1064 (42000) at line 2351406: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '<br /> <b>Fatal error</b>: Maximum execution time of 300 seconds exceeded in <b' at line 253

    However, the limits are set much higher:

        mysql> show global variables like "interactive_timeout";
        +---------------------+-------+
        | Variable_name       | Value |
        +---------------------+-------+
        | interactive_timeout | 28800 |
        +---------------------+-------+
        1 row in set (0.00 sec)

    and

        mysql> show global variables like "wait_timeout";
        +---------------+-------+
        | Variable_name | Value |
        +---------------+-------+
        | wait_timeout  | 28800 |
        +---------------+-------+
        1 row in set (0.00 sec)
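
    One thing worth checking, as a diagnostic sketch only: "Maximum execution time ... exceeded" is a PHP error message, and here it appears inside the data the server is parsing, which suggests the dump file itself contains PHP error output (for example from a web-based export that hit its own time limit) rather than the server hitting any timeout. Inspecting the offending line makes that easy to confirm:

        # look at the line MySQL complained about, plus a line of context either side
        sed -n '2351405,2351407p' domenyin_damian_fakty.sql

        # find every place PHP error text leaked into the dump
        grep -n "Maximum execution time" domenyin_damian_fakty.sql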

    Read the article

  • What's the simplest way to get .flac music files into iTunes?

    - by Chris Adams
    I'm looking for a simple way to import .flac files into iTunes, so I can play them on my Mac and, when I'm out, my iPhone, and I'm willing to bet I'm not the first person to want to do this. What are the best tools for doing this? The quality of the music is useful, but truth be told, the speakers I'd play them on are so crappy that a lot of the sound quality in the .flac format would be lost anyway, so I'm not averse to converting to mp3 files. If it helps give any context, I'm using a MacBook with OS X 10.5 Leopard and iTunes 9, connected to a 16 GB iPhone with standard Apple headphones, which is sometimes plugged into a Bose SoundDock for music, and the music files are piano performances. C
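
    One hedged sketch of the command-line route, assuming an ffmpeg build with ALAC and MP3 support is installed (e.g. via MacPorts); Apple Lossless keeps the quality and plays natively in iTunes and on the iPhone:

        # convert every FLAC in the current folder to Apple Lossless .m4a
        for f in *.flac; do
            ffmpeg -i "$f" -c:a alac "${f%.flac}.m4a"
        done

        # or, if mp3 is good enough for the speakers in question:
        # ffmpeg -i input.flac -c:a libmp3lame -q:a 0 output.mp3

    The resulting files can then simply be dragged into the iTunes library.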

    Read the article

  • Caching static content from Adobe, Microsoft, etc

    - by Tim
    I'm currently running the Apple SUS on a Mac OS X Server in a small office environment. It works well for Apple updates, but I'm still stuck with either manually downloading and installing Adobe/Microsoft updates on each computer or running them through a Squid cache, with the blind faith that Squid will keep the files I actually want to stay cached. What is the best way to cache updates locally for applications like the Adobe Updater or Microsoft AutoUpdate? Ideally cached in such a way that I can tell which files I do or do not have cached. It would also be nice to be able to cache things for other software like Firefox and Sparkle-enabled apps, but these are usually small enough to ignore.
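
    For the Squid route specifically, a hedged sketch of the kind of tuning that makes update caching less of a blind-faith exercise; the file types, sizes and paths below are illustrative and would need to be matched against what actually shows up in access.log:

        # additions to squid.conf

        # allow large installer payloads to be cached at all (default is only a few MB)
        maximum_object_size 512 MB

        # keep typical update/installer file types for a long time, even when the
        # updater sends reload or expiry headers
        refresh_pattern -i \.(cab|exe|msi|msp|dmg|pkg|zip)$ 43200 100% 43200 override-expire ignore-reload

    After a "squid -k reconfigure", grepping access.log for HIT versus MISS on those URLs shows exactly which update files are being served from the cache, which answers the "can I tell what is cached" part of the question.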

    Read the article

  • Configure iptables with a bridge and static IPs

    - by Andrew Koester
    I have my server set up with several public IP addresses, with a network configuration as follows (with example IPs):

        eth0
         \- br0 - 1.1.1.2
             |- [VM 1's eth0]
             |    |- 1.1.1.3
             |    \- 1.1.1.4
             \- [VM 2's eth0]
                  \- 1.1.1.5

    My question is, how do I set up iptables with different rules for the actual physical server as well as the VMs? I don't mind having the VMs doing their own iptables, but I'd like br0 to have a different set of rules. Right now I can only let everything through, which is not the desired behavior (as br0 is exposed). Thanks!
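
    A rough sketch of the "host rules vs. bridged VM traffic" split, not a complete ruleset; it assumes the host's own services arrive on br0 and that bridged traffic is left to the VMs' own firewalls (port 22 is only an example):

        # traffic addressed to the host itself (1.1.1.2) arrives on INPUT via br0
        iptables -A INPUT -i lo -j ACCEPT
        iptables -A INPUT -i br0 -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A INPUT -i br0 -p tcp --dport 22 -j ACCEPT
        iptables -A INPUT -i br0 -j DROP

        # frames bridged through to the VMs hit FORWARD when
        # net.bridge.bridge-nf-call-iptables=1; the physdev match keeps them
        # separate from the host rules above and passes them through untouched,
        # so each VM can enforce its own iptables policy
        iptables -A FORWARD -m physdev --physdev-is-bridged -j ACCEPT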

    Read the article

  • Wrap app with dynamic libraries into one large static app

    - by progo
    I have an old program that kind of depends on older dynamic libraries. They tend to get upgraded easily with the distro's updates. I figured that there would be a script using ldd that would gather the libs needed and create one bigger, statically linked application that wouldn't break so easily. If I could do this, a lot of older KDE libraries could be removed from my system and make my life easier. Thanks!
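
    True static re-linking isn't possible without the sources, but the ldd-driven part can be scripted. The sketch below (the binary name and paths are made up) copies the libraries the program currently resolves into a private directory and starts it with that directory first on the loader path, so later distro upgrades of those libraries no longer affect it:

        #!/bin/sh
        APP=/usr/local/bin/oldapp        # hypothetical binary
        DEST=/opt/oldapp-bundle
        mkdir -p "$DEST/libs"
        cp "$APP" "$DEST/"

        # copy every resolved shared library listed by ldd
        ldd "$APP" | awk '/=> \//{print $3}' | while read lib; do
            cp -v "$lib" "$DEST/libs/"
        done

        # launcher that prefers the bundled copies
        cat > "$DEST/run.sh" <<'EOF'
        #!/bin/sh
        DIR=$(dirname "$0")
        LD_LIBRARY_PATH="$DIR/libs" exec "$DIR/oldapp" "$@"
        EOF
        chmod +x "$DEST/run.sh"

    One caveat: libraries loaded at runtime via dlopen() (plugins, for example) don't show up in ldd output, so those may still need to be copied by hand.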

    Read the article

  • 404 with serving static files in a custom nginx configuration

    - by code90
    In my nginx configuration, I have the following:

        location /admin/ {
            alias /usr/share/php/wtlib_4/apps/admin/;

            location ~* .*\.php$ {
                try_files $uri $uri/ @php_admin;
            }

            location ~* \.(js|css|png|jpg|jpeg|gif|ico|pdf|zip|rar|air)$ {
                expires 7d;
                access_log off;
            }
        }

        location ~ ^/admin/modules/([^/]+)(.*\.(html|js|json|css|png|jpg|jpeg|gif|ico|pdf|zip|rar|air))$ {
            alias /usr/share/php/wtlib_4/modules/$1/admin/$2;
        }

        location ~ ^/admin/modules/([^/]+)(.*)$ {
            try_files $uri @php_admin_modules;
        }

        location @php_admin {
            if ($fastcgi_script_name ~ /admin(/.*\.php)$) {
                set $valid_fastcgi_script_name $1;
            }
            fastcgi_pass $byr_pass;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME /usr/share/php/wtlib_4/apps/admin$valid_fastcgi_script_name;
            fastcgi_param REDIRECT_STATUS 200;
            include /etc/nginx/fastcgi_params;
        }

        location @php_admin_modules {
            if ($fastcgi_script_name ~ /admin/modules/([^/]+)(.*)$) {
                set $byr_module $1;
                set $byr_rest $2;
            }
            fastcgi_pass $byr_pass;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME /usr/share/php/wtlib_4/modules/$byr_module/admin$byr_rest;
            fastcgi_param REDIRECT_STATUS 200;
            include /etc/nginx/fastcgi_params;
        }

    The following requested URL ends up with a 404:

        http://www.{domainname}.com/admin/modules/cms/styles/cms.css

    This is the error log entry:

        [error] 19551#0: *28 open() "/usr/share/php/wtlib_4/apps/admin/modules/cms/styles/cms.css" failed (2: No such file or directory), client: xxx.xxx.xxx.xxx, server: {domainname}.com, request: "GET /admin/modules/cms/styles/cms.css HTTP/1.1", host: "www.{domainname}.com"

    The following URLs work fine:

        http://www.{domainname}.com/admin/modules/store/?a=manage
        http://www.{domainname}.com/admin/modules/cms/?a=cms.load

    Can anyone see what the problem could be? Thanks. PS: I am trying to migrate existing sites from Apache to nginx.

    Read the article
