Search Results

Search found 7249 results on 290 pages for 'https everywhere'.

  • How to properly deny Railo directory access through Apache

    - by Sn3akyP3t3
    I've been battling with this and have failed to achieve my goal, which is to deny all access to all directories except the Public directory, and to allow access to all other directories only from specific IP addresses. To get Railo+Apache+Tomcat installed I pretty much followed this script: https://github.com/talltroym/Railo-Ubuntu-Installer-Script then verified settings with this tutorial: http://blog.nictunney.com/2012/03/railo-tomcat-and-apache-on-amazon-ec2.html

    From the installation script these mods are enabled:

        sudo a2enmod ssl
        sudo a2enmod proxy
        sudo a2enmod proxy_http
        sudo a2enmod rewrite
        sudo a2ensite default-ssl

    Outside of the script I copied sites-available to sites-enabled, then reloaded Apache. I have a directory created for Railo CFML located at /var/www/Railo/. Navigating the browser to http://Server_IP_Address/Railo forces SSL and relocates to https://Server_IP_Address/Railo, which shows off index.cfm. Not providing index.cfm and omitting https indicates that the DirectoryIndex directive and RewriteCond of Apache appear to be working for the sites-enabled VirtualHost.

    The problem I'm encountering is that I cannot seem to deny access to all directories except Public. My directory structure is rather simple and looks like this:

        Railo
            error
            Public
            NotPublic
            Sandbox

    These are my sites-enabled configurations:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www

            #Default Deny All to prevent walking backwards in file system
            Alias /Railo/ "/var/www/Railo/"
            <Directory ~ ".*/Railo/(?!Public).*">
                Order Deny,Allow
                Deny from All
            </Directory>

            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>

            ErrorLog ${APACHE_LOG_DIR}/error.log
            # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined

            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>

            DirectoryIndex index.cfm index.cfml default.cfm default.cfml index.htm index.html index.cfc

            RewriteEngine on
            RewriteCond %{SERVER_PORT} !^443$
            RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [L,R]
        </VirtualHost>

    and

        <IfModule mod_ssl.c>
        <VirtualHost _default_:443>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www

            Alias /Railo/ "/var/www/Railo/"
            <Directory ~ "/var/www/Railo/(?!Public).*">
                Order Deny,Allow
                Deny from All
            </Directory>

            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>

            ErrorLog ${APACHE_LOG_DIR}/error.log
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/ssl_access.log combined

            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>

            # SSL Engine Switch:
            SSLEngine on

            # Self-signed (snakeoil) certificate from the ssl-cert package:
            SSLCertificateFile    /etc/ssl/certs/ssl-cert-snakeoil.pem
            SSLCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key

            # Server certificate chain / CA / CRL / client authentication
            # (all left at the stock commented-out defaults):
            #SSLCertificateChainFile /etc/apache2/ssl.crt/server-ca.crt
            #SSLCACertificatePath /etc/ssl/certs/
            #SSLCACertificateFile /etc/apache2/ssl.crt/ca-bundle.crt
            #SSLCARevocationPath /etc/apache2/ssl.crl/
            #SSLCARevocationFile /etc/apache2/ssl.crl/ca-bundle.crl
            #SSLVerifyClient require
            #SSLVerifyDepth 10
            #SSLOptions +FakeBasicAuth +ExportCertData +StrictRequire

            <FilesMatch "\.(cgi|shtml|phtml|php)$">
                SSLOptions +StdEnvVars
            </FilesMatch>
            <Directory /usr/lib/cgi-bin>
                SSLOptions +StdEnvVars
            </Directory>

            BrowserMatch "MSIE [2-6]" \
                nokeepalive ssl-unclean-shutdown \
                downgrade-1.0 force-response-1.0
            # MSIE 7 and newer should be able to use keepalive
            BrowserMatch "MSIE [17-9]" ssl-unclean-shutdown

            DirectoryIndex index.cfm index.cfml default.cfm default.cfml index.htm index.html

            #Proxy .cfm and cfc requests to Railo
            ProxyPassMatch ^/(.+.cf[cm])(/.*)?$ http://127.0.0.1:8888/$1
            ProxyPassReverse / http://127.0.0.1:8888/

            #Deny access to admin except for local clients
            <Location /railo-context/admin/>
                Order deny,allow
                Deny from all
                # Allow from <Omitted>
                # Allow from <Omitted>
                Allow from 127.0.0.1
            </Location>
        </VirtualHost>
        </IfModule>

    The apache2.conf includes the following:

        # Include the virtual host configurations:
        Include sites-enabled/

        <IfModule !mod_jk.c>
            LoadModule jk_module /usr/lib/apache2/modules/mod_jk.so
        </IfModule>
        <IfModule mod_jk.c>
            JkMount /*.cfm ajp13
            JkMount /*.cfc ajp13
            JkMount /*.do ajp13
            JkMount /*.jsp ajp13
            JkMount /*.cfchart ajp13
            JkMount /*.cfm/* ajp13
            JkMount /*.cfml/* ajp13
            # Flex Gateway Mappings
            # JkMount /flex2gateway/* ajp13
            # JkMount /flashservices/gateway/* ajp13
            # JkMount /messagebroker/* ajp13
            JkMountCopy all
            JkLogFile /var/log/apache2/mod_jk.log
        </IfModule>

    I believe I understand most of this except the jk_module inclusion, which produces an error in the logs that I can't sort out:

        [warn] No JkShmFile defined in httpd.conf. Using default /etc/apache2/logs/jk-runtime-status

    I've checked my regular expression against the paths of the directories with RegexBuddy, just to be sure I wasn't wrong. The problem doesn't appear to be regex related, although I may have something incorrect in the Directory directive. The Location directive seems to be working correctly for blocking access to the Railo admin site.
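
    For reference, here is the direction I'm considering instead of the regex lookahead: a minimal sketch assuming Apache 2.2 and the paths above (the allowed address is a placeholder). Non-regex <Directory> sections merge most-specific-last, so the Public block should override the parent deny:

        <Directory "/var/www/Railo">
            Order Deny,Allow
            Deny from all
            # placeholder: the specific addresses allowed to see the non-public directories
            Allow from 203.0.113.10
        </Directory>
        <Directory "/var/www/Railo/Public">
            Order Allow,Deny
            Allow from all
        </Directory>

    I also realize that requests handed off by ProxyPassMatch/mod_jk may never be mapped to the filesystem at all, in which case a <LocationMatch> guard on the URL path would be needed in addition to the <Directory> sections.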

  • Updating Banshee to 2.4

    - by Lucasguy11
    I have Banshee 2.2.1 with Ubuntu 11.10. I have been trying to update Banshee to 2.4 (released yesterday) but it just isn't working. I have been using

        sudo add-apt-repository ppa:banshee-team/ppa

    in the terminal, from the Banshee.fm website, but after running it the terminal says this:

        You are about to add the following PPA to your system:
        PPA for Banshee Team
        This PPA contains the latest stable debs of Banshee for Ubuntu.
        To install Banshee, you must first enable the PPA on your system:
        1. Open Software Sources (System->Administration->Software Sources)
        2. Navigate to the "Third Party Sources" tab.
        3. Click "Add"
        4. Enter the APT line below that corresponds to your Ubuntu version that starts with "deb".
        5. Click "Add Source"
        6. Click "Close"
        7. It will prompt you to reload your software cache. Click "Reload".
        8. Now install the package "banshee" from Synaptic, or using the command below:
        sudo apt-get install banshee
        For those who wish to compile from trunk, add the deb-src line and then run "sudo apt-get build-dep" to install all required dependencies before starting to compile.
        Unstable (versions which have odd minor version numbers) debs of Banshee can be found here: https://launchpad.net/~banshee-team/+archive/banshee-unstable
        More info: https://launchpad.net/~banshee-team/+archive/ppa
        Press [ENTER] to continue or ctrl-c to cancel adding it
        Executing: gpg --ignore-time-conflict --no-options --no-default-keyring --secret-keyring /tmp/tmp.OPAjxemDQr --trustdb-name /etc/apt/trustdb.gpg --keyring /etc/apt/trusted.gpg --primary-keyring /etc/apt/trusted.gpg --keyserver hkp://keyserver.ubuntu.com:80/ --recv 9D2C2E0A3C88DD807EC787D74874D3686E80C6B7
        gpg: requesting key 6E80C6B7 from hkp server keyserver.ubuntu.com
        gpg: key 6E80C6B7: "Launchpad PPA for Banshee Team" not changed
        gpg: Total number processed: 1
        gpg: unchanged: 1

    I believe I have the PPA, but I'm not sure. I need a step-by-step process to get this; I've been trying to figure it out for quite a while now...
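
    For reference, the full flow as I understand it should be the following sketch (the middle step, refreshing the package cache so apt actually sees the PPA's 2.4 packages, is the one I'm not sure I ever ran):

        sudo add-apt-repository ppa:banshee-team/ppa
        sudo apt-get update           # refresh the cache so the PPA's packages become visible
        sudo apt-get install banshee  # installs/upgrades to the newest version apt can see

    The gpg output above ("not changed") suggests the PPA key was already imported, so the update/install steps would be what actually pulls in 2.4.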

  • Booting from USB on Mac Air (using setup_mac_usb_boot.sh)

    - by Mike O
    So, I've been working on this for hours and it's getting a little tiring. As some of you may know, installing Ubuntu on Macs is frequently an adventure, and I'm experiencing that right now. The part I'm hung up on at the moment is making a bootable USB. I would just use a CD, but my laptop is a MacBook Air (which doesn't have a CD drive), and I don't own an external CD drive.

    I initially attempted to use the command-line method supplied by the Ubuntu documentation here: https://help.ubuntu.com/community/How%20to%20install%20Ubuntu%20on%20MacBook%20using%20USB%20Stick However, that stick wasn't even recognized by rEFIt, even when I made a number of different modifications to the process, so I quickly decided to look elsewhere. I came across this guide: https://help.ubuntu.com/community/MacBookAir4-2#Basic_Installation_Instructions

    This ended up working to a large extent. If I choose the supplied GRUB from rEFIt, it brings me to the Ubuntu GRUB menu, asking me to try it, install, or check the disk. And if I choose to boot Linux directly from rEFIt, it brings me to the language selection menu. But when I make my selection from either of these menus, it pauses for about ten seconds and then gives me a command-line error message. It begins with "kernel panic - not syncing timer doesn't work through interrupt", and then shows about eight file names.

    Does anyone here have any ideas as to what could be causing this? I also tried the script with both Ubuntu 11.10 (the current version when the script was written) and 12.04.
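
    For completeness, the command-line method from the first link boils down to roughly this sketch on OS X (disk numbers are placeholders; I double-check diskutil list before dd'ing anything):

        hdiutil convert -format UDRW -o ubuntu.img ubuntu.iso   # OS X appends .dmg to the output
        diskutil list                                           # identify the USB stick, e.g. /dev/disk2
        diskutil unmountDisk /dev/diskN
        sudo dd if=ubuntu.img.dmg of=/dev/rdiskN bs=1m          # raw device node is much faster
        diskutil eject /dev/diskN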

  • Proper way to remove an active / inactive LVM snapshot

    - by user2622247
    I have created a sample Ruby script for removing extra LVM snapshots from the system. For removing an LVM snapshot, we use the lvremove command. This command works fine and we can remove snapshots from the system:

        # sudo lvremove /dev/ops/dbbackup
        lvremove -- do you really want to remove "/dev/ops/dbbackup"? [y/n]: y

    Sometimes, while removing snapshots, we get the following errors:

        Unable to deactivate open rootfs_12.10_20140812_00-cow (252:8)
        Failed to resume rootfs_12.10_20140812_00.
        libdevmapper exiting with 7 device(s) still suspended.

    The system then freezes: we cannot run any command or perform any action on it. After restarting the system, it functions fine. We can perform all the operations, and we can even delete that snapshot. Searching, I found these threads: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=659762 and https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=674682 The solutions in those threads apply after the error has already happened, but I want to avoid this type of error in the first place.

    My question: is there a better way of removing LVM snapshots, so that we can avoid this type of error? If anyone needs more info, feel free to ask me.
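
    One ordering we are experimenting with is deactivating the snapshot before removing it. A sketch, using the same volume names as above (I don't know yet whether it avoids the underlying bug, so treat it as a workaround pattern rather than a fix):

        sudo lvchange -an /dev/ops/dbbackup   # deactivate the snapshot so nothing holds it open
        sudo lvremove /dev/ops/dbbackup       # then remove it (add -f to skip the y/n prompt)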

  • How to re-configure graphics from Intel integrated to Intel / ATI switchable?

    - by Bucic
    There are lots of 'how to get switchable graphics to work' guides, but I found none that explain how to configure a system for switchable graphics operation on Ubuntu from the ground up, nor the current driver situation for particular computer models (integrated+discrete combinations). Example: https://help.ubuntu.com/community/HybridGraphics My system being mature and currently on the Intel integrated card also makes things complicated.

    System information:

    - Ubuntu 12.04 amd64, installed clean, with the system configured to use only the integrated Intel card
    - Lenovo Thinkpad T500
    - Intel GMA 4500MHD / ATI Mobility Radeon HD 3650

    Current situation: a mature and up-to-date system with no configuration changes beyond what's given above. I've made a backup image of the system (Clonezilla), so regardless of what's written below, let's assume that's our starting point. If something in "What I have already tried" is not clear, you may as well disregard it.

    What I have already tried: configuring the BIOS to switchable graphics and then:

    - Installing Additional Hardware drivers - returned an error.
    - Installing the proprietary amd-driver-installer-12.6-legacy-x86.x86_64.run automatically - the system starts in 'low-graphics mode'. Tried fixing as per https://help.ubuntu.com/community/BinaryDriverHowto/ATI#Manually_installing_Catalyst_12.6.2C_special_case_for_Intel.2BAC8-ATI_hybrid_graphics Got lost, gave up.

    Please note that while configuring the BIOS for integrated graphics only is pretty straightforward, configuring it for switchable graphics is not.
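
    For anyone digging in: as far as I understand, on a muxed laptop like the T500 with the open-source radeon driver, the kernel's vga_switcheroo interface is what does the switching. A sketch of inspecting and driving it (requires debugfs mounted; these are generic commands, not T500-specific advice):

        sudo cat /sys/kernel/debug/vgaswitcheroo/switch               # lists IGD (Intel) / DIS (discrete) and which is active
        echo DIS | sudo tee /sys/kernel/debug/vgaswitcheroo/switch    # request the discrete card
        echo OFF | sudo tee /sys/kernel/debug/vgaswitcheroo/switch    # power down the inactive card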

  • OpenWorld on your iPad and iPhone

    - by KLaker
    Most of you probably know that each year I publish a data warehouse guide for OpenWorld which contains links to the latest data warehouse videos, a calendar of the most important sessions and labs, and a section that provides profiles and relevant links for all the most important data warehouse presenters. For this year’s conference I have made all this information available in an HTML app that runs on most smartphones and tablets. The pictures below show the HTML app running on iPad and iPhone. This exciting new web app contains information about why you should attend OpenWorld - just in case you have not yet booked your ticket! - as well as the following:

    - Getting to know 12c - a series of video interviews with George Lumpkin, Vice President of Data Warehouse Product Management
    - Your presenters - full biographies and links to social media sites for all the key data warehouse presenters
    - Must-see sessions - a list of all the most important data warehouse presentations at this year’s conference
    - Our customers - profiles of our most important data warehouse customers
    - Must-attend labs - a list of all the most important data warehouse hands-on labs at this year’s conference
    - Links - a list of links to the most important data warehouse sites

    If you want to run these web apps on your smartphone and/or tablet then follow these links:

    - iPhone - https://876d5e65b7768ca57d1fd1236578c9374b1fca87.googledrive.com/host/0Bz-zGlWahRf4OXNzejBiRFV5ZXc/iPhone-DWoow2014.html
    - iPad - https://876d5e65b7768ca57d1fd1236578c9374b1fca87.googledrive.com/host/0Bz-zGlWahRf4OXNzejBiRFV5ZXc/iPad-DW2014.html

    Android users: I have tested the app on Android, and there appears to be a bug in the way the Chrome browser displays iframes, because scrolling does not work. The app does work correctly if you use either the Android version of the Opera browser or the standard Samsung browser. If you have any comments about the app (content you would like to see) then please let me know. Enjoy OpenWorld.

  • Microsoft C# Most Valuable Professional

    - by Robz / Fervent Coder
    Recently I was awarded the Microsoft Most Valuable Professional (MVP) award for Visual C#. For those that don’t know, it’s an annual award based on nominations from peers and Microsoft. Although there are just over 4,000 MVPs worldwide from all kinds of specializations, there are fewer than 100 C# MVPs in the US. There is more information at the site: https://mvp.support.microsoft.com

        The Microsoft MVP Award is an annual award that recognizes exceptional technology community leaders worldwide who actively share their high quality, real world expertise with users and Microsoft. With fewer than 5,000 awardees worldwide, Microsoft MVPs represent a highly select group of experts. MVPs share a deep commitment to community and a willingness to help others. To recognize the contributions they make, MVPs from around the world have the opportunity to meet Microsoft executives, network with peers, and position themselves as technical community leaders.

    Here is my profile: https://mvp.support.microsoft.com/default.aspx/profile/rob.reynolds

    I want to thank those that nominated me; without nominations this would never have happened. Thanks to Microsoft for liking me and finding my achievements and contributions to the community to be worth something. It’s good to know when you put in a lot of hard work that you get rewarded! I also want to thank many of the people I have worked with over the last 7 years. You guys have been great and I’m definitely standing on the shoulders of giants! Thanks to KDOT for giving me that first shot into professional programming and the experience and all of the training! A special thanks to @drusellers for kick starting me when I went stale in my learning back in 2007 and for always pushing me and bouncing ideas off of me. Without you I don’t think I would have made it this far. Thanks Alt.NET for keeping it fresh and funky! A very special thank you goes out to my wife for supporting me and locking me in the basement to work on all of my initiatives!

  • Setting up Google Analytics for multiple subdomains

    - by Andrew G. Johnson
    So first, here's a snippet of my current Analytics JavaScript:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-30490730-1']);
        _gaq.push(['_setDomainName', '.apartmentjunkie.com']);
        _gaq.push(['_setSiteSpeedSampleRate', 100]);
        _gaq.push(['_trackPageview']);
        (function() {
          var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
        })();

    If you wanna have a quick peek at the site, the URL is ApartmentJunkie.com. Keep in mind the site is pretty bare-bones, but you'll get the idea -- basically it's very similar to craigslist in the sense that it's in the local space, so people pick a city and then get sent to a subdomain that is specific to that city, e.g. winnipeg.mb.apartmentjunkie.com.

    I put that up late last night, then had a look at the analytics and found that I am seeing only the request URI portion of the URLs in Analytics, as I would with any other site, only with this one it's a problem: winnipeg.mb.apartmentjunkie.com/map/ and brandon.mb.apartmentjunkie.com/map/ are two different pages and shouldn't be lumped together as /map/.

    I know the kneejerk response is likely going to be "hey, just set up a different Google Analytics profile for each subdomain", but there will eventually be a lot of subdomains, so Google's cap of 50 is going to be too limited, and, even more important, I want to see the data in aggregate for the most part. I am thinking of changing the JavaScript to something like:

        _gaq.push(['_trackPageview', String(document.domain) + String(document.location)]);

    But I am unsure if this is the best way, and figured someone else on wm.se would have had a similar situation that they could talk a bit about.
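
    The other option I'm weighing is the pattern I've seen suggested for exactly this case: keep one profile and prefix the tracked path with the hostname. A sketch against the same snippet (property ID as above):

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-30490730-1']);
        _gaq.push(['_setDomainName', '.apartmentjunkie.com']);  // one visitor cookie shared across subdomains
        _gaq.push(['_setSiteSpeedSampleRate', 100]);
        // report e.g. "/winnipeg.mb.apartmentjunkie.com/map/" instead of "/map/"
        _gaq.push(['_trackPageview', '/' + document.location.hostname + document.location.pathname]);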

  • First Look - Oracle Data Mining

    - by kimberly.billings
    In his blog, JT on EDM, James Taylor shares his analysis of Oracle Data Mining, including its new GUI and Exadata integration. While Oracle Data Mining has been available for a while, it is now easier to access and try via the Amazon cloud. Using the Oracle 11gR2 Data Mining Amazon Machine Image (AMI), you can launch an Oracle Data Mining-enabled instance directly through Amazon Web Services (AWS) and connect to it using the Oracle Data Miner graphical user interface.

    The new Oracle Data Mining GUI, which will be available to beta customers soon, provides more graphics, the ability to define, save and share analytical "work flows" to solve business problems, and more automation and simplicity. Taylor comments that "the UI looks to have a nice look and feel including graphical model development flows, easy access to the data, nice little micro graphs when browsing data records and more."

    On using Oracle Data Mining with Exadata, Taylor writes, "Oracle says that the use of the ODM routines in the Exadata kernel is faster than running a native ODM model in the database by a factor of 2 and that this increases as more joins are used. This could mean that ODM outperforms even third party in-database analytics."

    Taylor concludes his blog with a positive overall review, stating that "ODM is a nice product for Oracle database customers and well worth looking into. The new UI will only make it more so." Read the blog.

  • Sending Emails via Google SMTP - after some time quit working

    - by Chris
    On a website, I use PHPMailer to send automated registration emails etc., and also a newsletter tool (which loops through the emails and sends them one by one). In Gmail, under Settings, I configured and confirmed the @mydomain addresses, so I can send from @mydomain addresses without the Gmail address being displayed. Furthermore, I authorized the website to send mails with this link: https://accounts.google.com/DisplayUnlockCaptcha

    Now, after 2 months where everything worked perfectly fine, users suddenly stopped receiving emails, and most recently emails are not even being sent anymore. Also, I received many error messages like this:

        Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the recipient domain. We recommend contacting the other email provider for further information about the cause of this error. The error that the other server returned was: 550 550 5.4.1 [email protected]: Recipient address rejected: Access Denied (state 13).

    When I check with this tool: https://toolbox.googleapps.com/apps/checkmx/ it reports two non-critical errors:

        Relayhost configuration detected.
        There SHOULD be a valid SPF record.

    So, the questions I have are:

    - Does anybody have any hint why it stopped working, and what the error messages mean?
    - What can I do to fix it?
    - Where do I set an SPF record (cPanel?)?
    - What is a relayhost and how do I fix that?

    It is about 1000-1400 mails a day (Gmail's limit is 2000). Also, what can I do wrong when setting up an SPF record? I've heard there are some testing tools for that. Thank you so much already in advance for your help!
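
    On the SPF point specifically: from what I've read so far, SPF is just a DNS TXT record on the sending domain. A sketch of one that authorizes Google's servers (my domain is a placeholder; the include mechanism is Google's published one):

        mydomain.com.  3600  IN  TXT  "v=spf1 a mx include:_spf.google.com ~all"

    In cPanel this seems to live under Email Authentication or the DNS Zone Editor.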

  • Referral traffic not appearing properly in Google Analytics

    - by Crashalot
    We have a partnership arrangement with another site where we pay them for users sent to us. However, they claim our referral numbers for them are lower than theirs by 50%. They are tracking clicks in Google Analytics (using events) while we are using visits in Google Analytics. Are we doing something wrong with our Google Analytics installation?

        <!-- Google Analytics BEGIN -->
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-12345678-1']);
          _gaq.push(['_setDomainName', 'example.com']);
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>
        <!-- Google Analytics END -->
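
    One thing I'm already aware of when comparing the numbers: a click event on their side and a visit on ours are different units, since several clicks in one session collapse into one visit, and visitors who leave before ga.js loads never register at all. If sessions are supposed to carry across the two domains, I understand classic GA also needs linker decoration on their outbound links. A sketch of the partner-side pattern (account ID and URL are placeholders):

        _gaq.push(['_setAccount', 'UA-00000000-1']);
        _gaq.push(['_setAllowLinker', true]);
        // decorate the outbound link so the GA cookies travel with the click:
        // <a href="http://example.com/landing"
        //    onclick="_gaq.push(['_link', this.href]); return false;">Visit</a>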

  • Analytics: Test events not showing up - how to troubleshoot?

    - by David Parks
    I've got 3 profiles: Master, Raw Data, and Test; on the Test profile I have no filters configured. I want to test using some events. I created a local HTML file, as shown below, to generate some test data that I could play with in Analytics. But the events never showed up in Analytics. I wonder what I might be doing wrong? Is the lack of a domain an issue, maybe?

        <html><head></head><body>Login_popup_complete_Facebook
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-28554309-1']);
          _gaq.push(['_trackPageview']);
          _gaq.push(['_trackEvent', 'Login popup completed', 'Facebook']);
          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>
        </body></html>
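
    Regarding the lack of a domain: classic ga.js stores its state in first-party cookies, and a page opened from file:// has no host to set them on, so I'm wondering whether nothing gets sent at all. The documented escape hatch I found for local testing is a sketch like this (same account ID as above):

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-28554309-1']);
        _gaq.push(['_setDomainName', 'none']);   // no cookie domain; intended for pages without a real host
        _gaq.push(['_trackPageview']);
        _gaq.push(['_trackEvent', 'Login popup completed', 'Facebook']);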

  • Simple ADF page using BAM Data Control

    - by [email protected]
    Purpose: In this blog I will walk you through very simple steps to create an ADF page using a BAM data control connection.

    Details:

    Create the project

    - Open JDeveloper (make sure you have installed the SOA extension for JDev).
    - Create a new Application using the "Generic Application" template.
    - Click on "Next".
    - Shuttle "ADF Faces" to the right pane for the project technology.
    - Click "Finish".

    Create a BAM connection

    - In the Resource Palette click on "Folder -> New Connection -> BAM".
    - Enter the connection name and click "Next".
    - Enter the connection details.
    - Click on "Test connection" and "Finish".

    Create the BAM data control

    - Open the IDE connection created in the step above.
    - Drag and drop "Employees" to the "Data Controls" palette.
    - Select "Flat Query" and click "Finish".

    Create the view

    - Create a new JSF page.
    - From the Data Control panel drag and drop "Employees -> Query -> ADF Read Only table".
    - Right click and run the page.

  • extra configuration needed after installing SSL certificate?

    - by ptriek
    We recently developed two rather simple PHP applications for AXA (a European bank). The URLs are axa.tfo.be/incentives/cipres and axa.tfo.be/incentives/zrkk (access to both sites is restricted to visitors with cookies containing encrypted passwords). In a previous security audit by an external company, several security issues were found. All of these issues have been solved by a colleague PHP developer. However, one last requirement has been added: all data should be transferred over https. My PHP colleague is on holiday, however, and unavailable at the moment. So I contacted my host and asked for an SSL certificate to be installed. I myself have no knowledge of or experience with SSL, so I'm a bit at a loss with the following problems.

    A Comodo SSL certificate plus a unique IP address was installed today by my webhost (www.combell.be) for the subdomain axa.tfo.be. However, it doesn't seem to be working. I posted a question about this earlier today and was told not to worry; see: http://serverfault.com/questions/339320/what-happens-if-you-install-an-ssl-certificate

    Current problems:

    - the web applications aren't accessible over https; http works though (if a valid cookie is available)
    - there's a static html page at http://axa.tfo.be/incentives/cipres/static.html, and even that page is only accessible over http

    My webhost is telling me that 'my application probably doesn't support SSL' and has asked me to set an SSL variable to true in my PHP code. So my questions:

    I have basic knowledge of PHP but don't know where to start regarding the 'PHP SSL variable'. The sites have been online for some time and were developed for regular http access. (Google didn't bring me any help, either.) Can anyone point me in the right direction, or give me some clues about whether/what I should ask my webhost for further assistance? (I'm on a bit of a tight schedule: the sites will be audited again on Monday, and it's a customer I wouldn't want to lose...)

    Thanks for looking into this, and sorry if my questions sound a bit nooby - I'm a webdesigner, not a server specialist...
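
    From what I've gathered so far, there is no standard 'PHP SSL variable'; PHP just sees $_SERVER['HTTPS'] when a request already arrived over TLS. Once the certificate actually answers on port 443, forcing the applications onto https seems to be done in front of the code. A sketch in .htaccess, assuming Apache with mod_rewrite (which shared hosts like this usually enable):

        RewriteEngine On
        RewriteCond %{HTTPS} !=on
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    And since our access control is cookie-based, I assume the cookies should also be marked secure once https works, e.g. setcookie($name, $value, $expires, '/', '', true, true), where the last two flags are 'secure' and 'httponly'.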

  • Colour coding of the status bar in SQL Server Management Studio - Oh dear

    - by simonsabin
    The new feature in SQL Server 2008, having your query window status bar colour-coded to the server you are on, is great. It's a nice way to distinguish production from development servers. Unfortunately, it was pointed out to me by a client recently that it doesn't always work. To me that sort of makes it pointless. It's a bit like having brakes that work some of the time. Are you going to play Russian roulette every time you execute a query? What's more, the colour doesn't change if you change the connection. So you can flip between dev and production servers, but your status bar stays the colour you set for the dev server.

    It really annoys me to find features that sort of work. The reason I initially gave up on SQLPrompt was that it didn't work 100% of the time, and in the time it didn't work I wasted so much effort trying to get it going that I lost more time than if I didn't have it. (I will say that was 2-3 years ago.)

    If you would like to use this feature but aren't because of these bugs, please vote for them:

    - https://connect.microsoft.com/SQLServer/feedback/details/504418/ssms-make-color-coding-of-query-windows-work-all-the-time
    - https://connect.microsoft.com/SQLServer/feedback/details/361832/update-status-bar-colour-when-changing-connections

  • ZenGallery: a minimalist image gallery for Orchard

    - by Bertrand Le Roy
    There are quite a few image gallery modules for Orchard, but they were not invented here. I wanted something a lot less sophisticated, something as barebones and minimalist as possible out of the box, to make customization extremely easy. So I made this, in less than two days (during which I got distracted a lot).

    Nwazet.ZenGallery uses existing Orchard features as much as it can:

    - Galleries are just a content part that can be added to any type
    - The set of photos in a gallery is simply defined by a folder in Media
    - Managing the images in a gallery is done using the standard media management from Orchard
    - Ordering of photos is simply the alphabetical order of the filenames (use 1_, 2_, etc. prefixes if you have to)
    - The path to the gallery folder is mapped from the content item using a token-based pattern
    - The pattern can be set per content type
    - You can edit the generated gallery path for each item
    - The default template is just a list of links over images that open in a new tab
    - No lightbox script comes with the module; just customize the template to use your favorite script

    Light, light, light. Rather than explaining this very simple module in more detail, here is a video that shows how I used the module to add photo galleries to a product catalog: Adding a gallery to a product catalog

    You can find the module on the Orchard Gallery: https://gallery.orchardproject.net/List/Modules/Orchard.Module.Nwazet.ZenGallery/

    The source code is available from BitBucket: https://bitbucket.org/bleroy/nwazet.zengallery

  • HTML5 Development for Dummies

    - by Geertjan
    What's HTML5 all about, and what does it actually mean, concretely, to develop HTML5 applications? NetBeans IDE 7.3 provides something called "Project Easel", which is a bundling of HTML5-related tools into a coherent toolset. Within a matter of hours, you'll know everything you need to know about what all this is about if you follow the steps below.

    Get A Solid Overview. Start by viewing this screencast from JavaOne 2012 (click the media link on the right side once you've clicked the link below; a downloadable MP4 file is also available there): https://oracleus.activeevents.com/connect/sessionDetail.ww?SESSION_ID=4038 That is an awesome way to get you in the right mindframe for what HTML5 is and how it fits into the programming world, together with a very cool and entertaining demo, presented by JB Brock. He starts with about three slides and then does a super awesome demo that puts you into the picture very quickly.

    Understand How HTML5 Relates To Java EE. Now here's a very cool follow-up to the above, again demo-driven (click the media links on the right side once you've clicked the link below): https://oracleus.activeevents.com/connect/sessionDetail.ww?SESSION_ID=4737 David Konecny takes the Affable Bean project created via the NetBeans E-commerce Tutorial and creates an HTML5 front end for it! I.e., you are shown how HTML5 can provide a different front end, as an alternative to JSF. Why would you do that? Well, that's explained in David's session, as well as in JB Brock's session; i.e., choose the right technology for the right situation. Sometimes HTML5 might make sense, other times JSF might make sense.

    Follow The NetBeans Screencasts. To revise and firm up everything you've learned from the above two JavaOne sessions, watch two screencasts by Ken Ganfield: part 1, Getting Started with HTML5, and part 2, Working with JavaScript in HTML5 Applications. In particular, you'll learn how NetBeans IDE provides tools to thoroughly cover the needs of HTML5 developers.

    Having taken the above three steps, you now have a thorough background, together with an understanding of the tools and procedures needed for creating your own HTML5 applications.

  • How do I compile a Wikipedia lens and install?

    - by user49523
    I read a tutorial about how to compile and install a Wikipedia lens, but it didn't work. The tutorial sounded easy: I just copied and pasted into the file it said to edit. I have tried a few times; here are two of my edits.

    Edit 1:

        import logging
        import optparse

        import gettext
        from gettext import gettext as _
        gettext.textdomain('wikipedia')

        from singlet.lens import SingleScopeLens, IconViewCategory, ListViewCategory

        from wikipedia import wikipediaconfig

        import urllib2
        import simplejson


        class WikipediaLens(SingleScopeLens):

            wiki = "http://en.wikipedia.org"

            def wikipedia_query(self, search):
                try:
                    search = search.replace(" ", "|")
                    url = ("%s/w/api.php?action=opensearch&limit=25&format=json&search=%s"
                           % (self.wiki, search))
                    results = simplejson.loads(urllib2.urlopen(url).read())
                    print "Searching Wikipedia"
                    return results[1]
                except (IOError, KeyError, urllib2.URLError, urllib2.HTTPError,
                        simplejson.JSONDecodeError):
                    print "Error : Unable to search Wikipedia"
                    return []

            class Meta:
                name = 'Wikipedia'
                description = 'Wikipedia Lens'
                search_hint = 'Search Wikipedia'
                icon = 'wikipedia.svg'
                search_on_blank = True

            # TODO: Add your categories
            articles_category = ListViewCategory("Articles", "dialog-information-symbolic")

            def search(self, search, results):
                for article in self.wikipedia_query(search):
                    results.append("%s/wiki/%s" % (self.wiki, article),
                                   "http://upload.wikimedia.org/wikipedia/commons/6/63/Wikipedia-logo.png",
                                   self.articles_category,
                                   "text/html",
                                   article,
                                   "Wikipedia Article",
                                   "%s/wiki/%s" % (self.wiki, article))
                pass

    Edit 2:

        import urllib2
        import simplejson

        import logging
        import optparse

        import gettext
        from gettext import gettext as _
        gettext.textdomain('wikipediaa')

        from singlet.lens import SingleScopeLens, IconViewCategory, ListViewCategory

        from wikipediaa import wikipediaaconfig


        class WikipediaaLens(SingleScopeLens):

            wiki = "http://en.wikipedia.org"

            def wikipedia_query(self, search):
                try:
                    search = search.replace(" ", "|")
                    url = ("%s/w/api.php?action=opensearch&limit=25&format=json&search=%s"
                           % (self.wiki, search))
                    results = simplejson.loads(urllib2.urlopen(url).read())
                    print "Searching Wikipedia"
                    return results[1]
                except (IOError, KeyError, urllib2.URLError, urllib2.HTTPError,
                        simplejson.JSONDecodeError):
                    print "Error : Unable to search Wikipedia"
                    return []

            def search(self, search, results):
                for article in self.wikipedia_query(search):
                    results.append("%s/wiki/%s" % (self.wiki, article),
                                   "http://upload.wikimedia.org/wikipedia/commons/6/63/Wikipedia-logo.png",
                                   self.articles_category,
                                   "text/html",
                                   article,
                                   "Wikipedia Article",
                                   "%s/wiki/%s" % (self.wiki, article))
                pass

            class Meta:
                name = 'Wikipedia'
                description = 'Wikipedia Lens'
                search_hint = 'Search Wikipedia'
                icon = 'wikipedia.svg'
                search_on_blank = True

            # TODO: Add your categories
            articles_category = ListViewCategory("Articles", "dialog-information-symbolic")

            def search(self, search, results):
                # TODO: Add your search results
                results.append('https://wiki.ubuntu.com/Unity/Lenses/Singlet',
                               'ubuntu-logo',
                               self.example_category,
                               "text/html",
                               'Learn More',
                               'Find out how to write your Unity Lens',
                               'https://wiki.ubuntu.com/Unity/Lenses/Singlet')
                pass

    So... what can I change in the edit? (If anybody can give me the entire edited file, I will appreciate it.)

  • Configuring LiveID authentication with SharePoint2010

    - by ybbest
    With the addition of the new claims-based authentication framework in SharePoint 2010, SharePoint is now more loosely coupled to the authentication layer than ever. You’ve probably seen presentations or webinars where it was mentioned that you can use claims authentication against authentication providers such as Live ID and OpenID. In this blog I will show you some common problems you may hit while configuring LiveID integration with SharePoint 2010. The detailed configuration can be found in the following blogs:

    - Part 1 – http://www.wictorwilen.se/Post/Visual-guide-to-Windows-Live-ID-authentication-with-SharePoint-2010-part-1.aspx
    - Part 2 – http://www.wictorwilen.se/Post/Visual-guide-to-Windows-Live-ID-authentication-with-SharePoint-2010-part-2.aspx
    - Part 3 – http://www.wictorwilen.se/Post/Visual-guide-to-Windows-Live-ID-authentication-with-SharePoint-2010-part-3.aspx

    Here are some problems I hit while following the instructions.

    Problem 1: the following exception when running the PowerShell script to create the new LiveID authentication provider:

        New-SPTrustedIdentityTokenIssuer : Exception of type 'System.ArgumentException' was thrown.
        Parameter name: claimType
        At line:1 char:42
        + $authp = New-SPTrustedIdentityTokenIssuer <<<< -Name "LiveID INT" -Description "LiveID INT" -Realm $realm -ImportTrustCertificate $certfile -ClaimsMappings $emailclaim,$upnclaim -SignInUrl "https://login.live-int.com/login.srf" -IdentifierClaim $emailclaim.InputClaimType
        + CategoryInfo : InvalidData:(Microsoft.Share...dentityProvider:SPCmdletNewSPIdentityProvider) [New-SPTrustedIdentityTokenIssuer], ArgumentException
        + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletNewSPIdentityProvider

    Solution: you need to remove the existing SPTrustedIdentityTokenIssuer.

    1. First get the name of the existing token issuer with Get-SPTrustedIdentityTokenIssuer, then run Remove-SPTrustedIdentityTokenIssuer to remove it.
    2. After that, re-run the script; everything should work fine now.

    Problem 2: Live INT automatically logs out. Whenever I try to log in (https://login.live-int.com/login.srf), after entering a valid email/password I get redirected to the logout page.

    Solution: you can find the solution in my previous blog.
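
    A sketch of what step 1 looks like in the SharePoint 2010 Management Shell (the issuer name is whatever Get-SPTrustedIdentityTokenIssuer reports; "LiveID INT" here is just the name used above):

        Get-SPTrustedIdentityTokenIssuer                  # note the Name of the existing issuer
        Remove-SPTrustedIdentityTokenIssuer "LiveID INT"  # remove it, confirming when prompted
        # ...then re-run the New-SPTrustedIdentityTokenIssuer script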

  • UIView vs CCLayer and Making gestures work in Cocos2d

    - by Lewis
    I've been searching for an answer to this question for at least 3 days now. I've tried many cocos2d forums, including the official one, and have heard nothing back. I've found a project which uses custom gestures: https://github.com/melle/OneFingerRotationGestureDemo with an explanation here: http://blog.mellenthin.de/archives/2012/02/13/an-one-finger-rotation-gesture-recognizer/

    Now I want to implement that behaviour on a sprite in a cocos2d application. I've tried to do this, but it fails to work. It uses a view controller which inherits like this:

        @interface OneFingerRotationGestureViewController : UIViewController <OneFingerRotationGestureRecognizerDelegate>

    Now my question is: how would I implement the OneFingerRotationGesture behaviour on a CCSprite in cocos2d 2.0? Interfaces in cocos2d look like this:

        @interface HelloWorldLayer : CCLayer

    Now, I have asked a similar question on Stack Overflow, and a user directed me to this link: https://github.com/krzysztofzablocki/CCNode-SFGestureRecognizers which I believe makes use of gestures (like the first linked project), but not custom gestures. I lack the Obj-C skills to work out and implement the functionality in my game, so I would appreciate it if someone could explain the differences between CCLayer and UIViewController, and help me implement the OneFingerRotation gesture in a cocos2d 2.0 project.

    Regards, Lewis.

  • Google analytics iframe code measuring visitor as two visitors

    - by Maarten
    I'm trying to measure visitors in an iframe and in the site containing the iframe. What I would like is for clicks in the iframe to be seen as coming from the same visitor as on the containing site, but somehow they are seen as two separate visitors. I followed the examples from http://www.blastam.com/blog/index.php/2011/02/google-analytics-cross-domain-tracking/, trimmed down to an even simpler version based on the comments saying that setDomainName is not needed anymore, but with setDomainName I get the same result: a click on a page and a click in the iframe are counted as 2 clicks by 2 separate visitors.

    This is the code in my iframe:

        if (_gaq && gaAccount.length > 0) {
          _gaq.push(['_setAccount', gaAccount]);
          _gaq.push(['_setAllowLinker', true]);
          //_gaq.push(['_setDomainName', 'none']);
          _gaq.push(['_trackPageview', 'mytestcountername']);
        }

    And this is the code in the containing page:

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-9605474-4']);
          _gaq.push(['_setAllowLinker', true]);
          //_gaq.push(['_setDomainName', '.domain.nl']);
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>
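
    For comparison, this is the setup I understand classic ga.js expects when the iframe is served from a subdomain of the same root domain (if the iframe lived on a completely different domain, I gather the page would have to pass the cookies across with _link/_getLinkerUrl instead). A sketch, with both sides setting the same _setDomainName, uncommented:

        // containing page and iframe page alike:
        _gaq.push(['_setAccount', 'UA-9605474-4']);
        _gaq.push(['_setDomainName', '.domain.nl']);  // same shared cookie domain on both sides
        _gaq.push(['_setAllowLinker', true]);
        _gaq.push(['_trackPageview']);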

  • apt-get upgrade gives "403 forbidden" error

    - by 3l4ng
    I'm running Ubuntu 13.04 64-bit. sudo apt-get update works fine, but when I run sudo apt-get upgrade I get these errors:

        Err http://archive.ubuntu.com/ubuntu/ raring-updates/main python3.3-minimal amd64 3.3.1-1ubuntu5.2  403 Forbidden
        Err http://ppa.launchpad.net/otto-kesselgulasch/gimp/ubuntu/ raring/main gimp amd64 2.8.6-0raring1~ppa  403 Forbidden
        Err http://archive.ubuntu.com/ubuntu/ raring-security/main python3.3-minimal amd64 3.3.1-1ubuntu5.2  403 Forbidden
        Err http://ppa.launchpad.net/otto-kesselgulasch/gimp/ubuntu/ raring/main gimp-help-en all 1:2.8-0raring16~ppa  403 Forbidden
        Failed to fetch http://archive.ubuntu.com/ubuntu/pool/main/p/python3.3/python3.3-minimal_3.3.1-1ubuntu5.2_amd64.deb  403 Forbidden
        Failed to fetch http://ppa.launchpad.net/otto-kesselgulasch/gimp/ubuntu/pool/main/g/gimp/gimp_2.8.6-0raring1~ppa_amd64.deb  403 Forbidden
        Failed to fetch http://ppa.launchpad.net/otto-kesselgulasch/gimp/ubuntu/pool/main/g/gimp-help/gimp-help-en_2.8-0raring16~ppa_all.deb  403 Forbidden
        E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?

    Running sudo apt-get upgrade --fix-missing installs some updates, but the above errors still persist when I run apt-get upgrade again. The software update app shows this error: https://www.dropbox.com/s/2cr450557hmahzz/software_update.jpg and selecting continue shows: https://www.dropbox.com/s/l7u32sxyfbxxeeg/soft_upd2.jpg (sorry for the links, I don't have enough rep to post images).

    I am behind a proxy, but apt-get update and web browsing work without issues. I also don't believe a server being down is causing this, as the problem has been there for over a month. Any ideas on how to fix this?
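
    Since browsing works but the .deb downloads get 403, one thing I want to rule out is the proxy filtering apt's traffic: apt doesn't use the browser's proxy settings, so it may be hitting the proxy unauthenticated or bypassing it entirely. A sketch of pointing apt at the proxy explicitly (host and port are placeholders), e.g. in /etc/apt/apt.conf.d/95proxy:

        Acquire::http::Proxy "http://proxy.example.com:8080/";
        Acquire::https::Proxy "http://proxy.example.com:8080/";

    Then sudo apt-get update && sudo apt-get upgrade again; switching to a different mirror in Software Sources would be the other cheap test.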

  • Google Analytics async=true seems wrong in the Google documentation?

    - by leeand00
    In the Google Analytics async example, they state that in order to include more than one tracker you need to set up your pages for asynchronous tracking, and they do so using the following code:

        <script type="text/javascript">
          _gaq.push(
            ['_setAccount', 'UA-XXXXX-1'],
            ['_trackPageview'],
            ['b._setAccount', 'UA-XXXXX-2'],
            ['b._trackPageview']
          );
          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>

    The second tracker is not receiving any results. After checking my tracking codes to ensure they are correct, I noticed that the ga.async statement is specified differently most of the time and is never set to a value of true; it's often set to async but never true. Could this be stopping my Analytics data from posting to the second tracker? Or might it be something else? Also, what calls should I look for in the Net tab in Firebug to ensure that GA is being called when the page loads?
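
    One data point on the async flag itself: as far as I can tell, the DOM property and the HTML attribute are just two spellings of the same thing, so ga.async = true should be fine by itself. A sketch of the equivalence:

        // DOM property form, used when the script element is created from JavaScript:
        ga.async = true;
        // equivalent attribute form seen in static HTML:
        // <script async src="http://www.google-analytics.com/ga.js"></script>

    In the Net tab, I believe each successful hit shows up as a request for __utm.gif against google-analytics.com; with two trackers I'd expect two of them per pageview.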

  • IIS cache control header settings

    - by a_m0d
    I'm currently working on a website that is accessed over https. We have recently come across a problem where we are unable to view .pdf files, or any other type of file that is sent as an attachment (Content-Disposition: attachment). According to the Microsoft Knowledge Base, this is due to the fact that Cache-Control is set to no-cache.

    However, we have a requirement that all pages be fully reloaded every time they are visited, so we have disabled caching on all pages (through our ASP code, not through IIS settings). I have made a special case of the one page that serves the attachment, and it now returns a header with Cache-Control: private and an expiry set to 1 minute in the future. This works fine when I test it on my local machine over https. However, when I deploy it to our test server and try it, the response headers still come back with Cache-Control: no-cache. There is no firewall or anything between me and the server, so IIS itself must be adding these headers and replacing mine. I have no idea why it would do this, and it doesn't really make any sense, but it seems to be the only option at the moment (I haven't yet found any other place in the code that changes the cache headers). Can anyone point me to a possible place where IIS might be setting these header values?
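
    For reference, the per-page override I described looks roughly like this sketch in our classic ASP (the file name is a placeholder):

        <%
        ' cacheable enough for IE to hand the file to the PDF reader over SSL:
        Response.CacheControl = "private"
        Response.Expires = 1   ' minutes
        Response.AddHeader "Content-Disposition", "attachment; filename=report.pdf"
        %>

    If headers like this are being overwritten server-side, I'm told the usual suspects are the site-level 'Enable content expiration' setting in IIS Manager, or some later code calling Response.AddHeader "Cache-Control", "no-cache" after this point.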

  • MDX needs a function or macro syntax

    - by Darren Gosbell
    I was having an interesting discussion with a few people about the impact of named sets on performance (the same discussion noted by Chris Webb here: http://cwebbbi.wordpress.com/2011/03/16/referencing-named-sets-in-calculations). Apparently the core of the performance issue comes down to the way named sets are materialized within the SSAS engine. Which led me to the thought that what we really need is a syntax for declaring a non-materialized set, or, to take this even further, a way of declaring an MDX expression as a function or macro so that it can be reused in multiple places. Sometimes you do want the set materialized, such as when you use an ordered set for calculating rankings. But a lot of the time we just want to make our MDX modular and avoid repeating the same code over and over.

    I did some searches on Connect and could not find any similar suggestions, so I posted one here: https://connect.microsoft.com/SQLServer/feedback/details/651646/mdx-macro-or-function-syntax

    Although apparently I did not search quite hard enough, as Chris Webb made a similar suggestion some time ago, including a request for true MDX stored procedures (not the .Net-style stored procs that we have at the moment): https://connect.microsoft.com/SQLServer/feedback/details/473694/create-parameterised-queries-and-functions-on-the-server

    Chris also pointed out this post that he did last year, http://cwebbbi.wordpress.com/2010/09/13/iccube/, where he noted that the icCube product already has this sort of functionality. So if you think either or both of these suggestions is a good idea, then I would encourage you to click on the links and vote for them.
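
    To make the duplication concrete, a sketch (the cube and member names are in the Adventure Works style, purely illustrative): today the same set expression has to be repeated in every calculation that needs it, because factoring it out into a named set would materialize it:

        WITH
          MEMBER Measures.[Top10 Sales] AS
            Sum(TopCount([Product].[Product].Members, 10, Measures.[Sales Amount]),
                Measures.[Sales Amount])
          MEMBER Measures.[Top10 Count] AS
            Count(TopCount([Product].[Product].Members, 10, Measures.[Sales Amount]))
        SELECT { Measures.[Top10 Sales], Measures.[Top10 Count] } ON 0
        FROM [Adventure Works]

    A macro syntax would let the TopCount(...) expression be declared once and expanded inline at each use, leaving the engine free to evaluate it lazily rather than materializing a set.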
