Search Results

Search found 7249 results on 290 pages for 'https everywhere'.


  • .htaccess URL is being cut at a space (I'm using Codeigniter)

    - by Ice
    I've used the following code to map https://live.example.com onto http://example.com/api:

        Options +FollowSymLinks
        RewriteEngine On
        RewriteCond %{HTTPS} =on
        RewriteCond %{HTTP_HOST} ^live\.example\.com [NC]
        RewriteRule (.+)$ "http://example.com/api/$1" [L,P]

    The following url: https://live.example.com/userlinks/xml/John%20James/MvPjeUTo15/ is supposed to map onto http://example.com/api/userlinks/xml/John%20James/MvPjeUTo15/. Instead it maps to http://example.com/api/userlinks/xml/John, so it seems to cut the url at the space. I am using the Codeigniter framework, but I am not sure if the problem lies there. Also, using %20 or just a space in the url bar produces no different result. Does anyone have an idea why this happens, or a solution? Thank you very much, Ice
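
    One commonly suggested fix, offered here only as a sketch that hasn't been verified against this setup: by the time the rule runs, the %20 has already been decoded to a literal space inside $1, and the proxied request breaks at that space. mod_rewrite's B flag re-escapes the backreference before it is substituted into the target URL:

    ```apache
    Options +FollowSymLinks
    RewriteEngine On
    RewriteCond %{HTTPS} =on
    RewriteCond %{HTTP_HOST} ^live\.example\.com [NC]
    # B re-escapes special characters (including the space) in $1 before substitution
    RewriteRule (.+)$ http://example.com/api/$1 [B,L,P]
    ```

    Whether the backend then accepts the re-escaped path segments is something to test; this is a guess at the cause, not a confirmed answer.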

    Read the article

  • Flex 3 Regular Expression Problem

    - by Tommy
    I've written a url validator for a project I am working on. For my requirements it works great, except that when the last part of the url is longer than 22 characters it breaks. My expression: /((https?):\/\/)([^\s.]+.)+([^\s.]+)(:\d+\/\S+)/i It expects input that looks like "http(s)://hostname:port/location". When I give it the input https://demo10:443/111112222233333444445 it works, but if I pass the input https://demo10:443/1111122222333334444455 it breaks. You can test it out easily at http://ryanswanson.com/regexp/#start. Oddly, I can't reproduce the problem with just the (I would think) relevant part /(:\d+\/\S+)/i: I can have as many characters after the required / as I like and it works great. Any ideas or known bugs?

    Read the article

  • Preventing a Firefox Extension's load event from triggering across tabs

    - by lgomez
    Hi, I've been working on a Firefox extension that uses an iFrame to do some background scraping. I have gone through a number of hoops to get it to trigger only once, in the window/tab where it should, but now when I open a different tab, that tab triggers the load event as well. I found this: https://developer.mozilla.org/en/Code_snippets/Progress_Listeners but the snippets are not there. This was helpful before but doesn't solve the problem: https://developer.mozilla.org/en/Code_snippets/On_page_load More frustrating: if there is this: https://developer.mozilla.org/En/Listening_to_events_on_all_tabs then why should I worry about the normal event listener triggering in more than one tab? Any ideas or examples?
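
    One pattern that may help, shown here only as a sketch with assumed names (the "scraperFrame" id is hypothetical, not taken from the extension's code): attach the load listener to the hidden iframe element itself rather than to the browser/appcontent area, and check event.originalTarget so that load events bubbling up from other documents are ignored.

    ```js
    // Listen on the extension's own hidden iframe, not on gBrowser.
    var scraperFrame = document.getElementById("scraperFrame");
    scraperFrame.addEventListener("load", function (event) {
      // Only react when the event really comes from our own frame's document,
      // not from a page loading in some other tab.
      if (event.originalTarget !== scraperFrame.contentDocument) {
        return;
      }
      // ... scraping logic here ...
    }, true);
    ```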

    Read the article

  • How can I change Twitter's Share button height?

    - by user1035890
    How can I change the height of Twitter's icon? I have another custom image, but the height stays the same. How do I fix this? I followed https://dev.twitter.com/docs/tweet-button and used the div method rather than the iframe because I wanted to add the data-title. This is the code I used:

        <script src="//platform.twitter.com/widgets.js" type="text/javascript"></script>
        <div>
        <a href="https://twitter.com/share" class="twitter-share-button"
           data-url="https://dev.twitter.com/pages/tweet_button"
           data-via="your_screen_name"
           data-text="Checking out this page about Tweet Buttons"
           data-related="anywhere:The Javascript API"
           data-count="vertical">Tweet</a>
        </div>

    Read the article

  • Apache Shiro, INI-Configuration, Perms per URL: How to get URL params?

    - by Marcus Schultö
    I want to use Apache Shiro[1] in my JSF application to perform URL-based authorization checks, with the configuration done in shiro.ini. As I see in the Shiro documentation[2], there is a way to use a "perms" filter:

        /remoting/rpc/** = authc, perms["remote:invoke"]

    In my scenario I want this functionality, but at entity level[3], where the entity id is in the http request:

        # "Open settings for user with id=123":
        # /user/settings.xhtml?user_id=123
        /user/settings.xhtml = perms["user:update:XXX"]

    So, how do I do this with Shiro? How do I tell the perms filter to check for http params? Or is this supposed to be done in my Realm implementation, concretely by calling FacesContext? [1] https://shiro.apache.org [2] https://shiro.apache.org/web.html#Web-webini [3] This can be done at least programmatically: SecurityUtils.getSubject().isPermitted("printer:query:lp7200") https://shiro.apache.org/permissions.html
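
    One way this is often handled, shown only as a sketch (the class name, filter key and the "user_id" parameter are assumptions, not something from the question): the bracket notation in shiro.ini cannot read request parameters at runtime, so a custom authorization filter builds the permission string per request and delegates to the realm.

    ```java
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import org.apache.shiro.web.filter.authz.AuthorizationFilter;

    public class UserUpdateFilter extends AuthorizationFilter {
        @Override
        protected boolean isAccessAllowed(ServletRequest request, ServletResponse response,
                                          Object mappedValue) throws Exception {
            // Build the permission string from the request parameter at request time.
            String userId = request.getParameter("user_id");   // e.g. "123"
            return userId != null
                    && getSubject(request, response).isPermitted("user:update:" + userId);
        }
    }
    ```

    It would then be registered in shiro.ini under [main] (e.g. userUpdate = com.example.UserUpdateFilter) and mapped with /user/settings.xhtml = userUpdate under [urls].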

    Read the article

  • RequireHttpsAttribute and Encrypted Request Data

    - by goatshepard
    I have a controller action that accepts sensitive data:

        public ActionResult TakeSensitiveData(SensitiveData data){
            data.SaveSomewhere();
        }

    To keep the data secure I want to be certain requests are made using HTTPS (SSLv3, TLS 1). One of the approaches I've considered is the RequireHttpsAttribute on my action:

        [RequireHttps]
        public ActionResult TakeSensitiveData(SensitiveData data){
            data.SaveSomewhere();
        }

    However, upon testing this, Fiddler revealed that an HTTP request made to the action is 302 redirected to HTTPS. My question is this: if my request is 302 redirected to HTTPS, haven't I already sent the sensitive data over HTTP before the redirect?
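
    The premise is right: the redirect cannot un-send data that already travelled over HTTP, so the main protection is serving the form itself over HTTPS so the browser never posts in the clear. On top of that, one option - a sketch only, not the project's code, and the attribute name is made up - is to reject rather than redirect non-HTTPS requests so the failure is loud:

    ```csharp
    using System.Web.Mvc;

    public class RejectNonHttpsAttribute : RequireHttpsAttribute
    {
        protected override void HandleNonHttpsRequest(AuthorizationContext filterContext)
        {
            // Return 403 instead of the default 302 redirect to HTTPS.
            filterContext.Result = new HttpStatusCodeResult(403, "HTTPS required");
        }
    }

    // Usage:
    // [RejectNonHttps]
    // public ActionResult TakeSensitiveData(SensitiveData data) { ... }
    ```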

    Read the article

  • Regexp to detect that the url doesn't end with an extension

    - by devnieL
    Hello. I'm using this regular expression to detect if a url ends with jpg:

        var exp = /(\b(https?|ftp|file):\/\/[-A-Z0-9+&@#\/%?=~_|!:,.;]*[-A-Z0-9+&@#\/%=~_|]*^\.jpg)/ig;

    It detects the url, e.g. http://www.blabla.com/sdsd.jpg, but now I want to detect that the url doesn't end with a jpg extension. I tried this:

        var exp = /(\b(https?|ftp|file):\/\/[-A-Z0-9+&@#\/%?=~_|!:,.;]*[-A-Z0-9+&@#\/%=~_|]*[^\.jpg]\b)/ig;

    but it only matches http://www.blabla.com/sdsd. Then I used this:

        var exp = /(\b(https?|ftp|file):\/\/[-A-Z0-9+&@#\/%?=~_|!:,.;]*[-A-Z0-9+&@#\/%=~_|]*[^\.jpg]$)/ig;

    It works if the url is alone, but doesn't work if the text is, e.g.: http://www.blabla.com/sdsd.jpg text
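
    A note and a sketch (not from the question): [^\.jpg] is a character class, so it only excludes the individual characters ., j, p and g, not the string ".jpg". One simple alternative, assuming the goal is to pull non-jpg urls out of arbitrary text, is to match urls with the working expression and filter them in a second step:

    ```js
    var urlExp = /\b(https?|ftp|file):\/\/[-A-Z0-9+&@#\/%?=~_|!:,.;]*[-A-Z0-9+&@#\/%=~_|]/ig;
    var text = "http://www.blabla.com/sdsd.jpg text http://www.blabla.com/page";

    var nonJpgUrls = (text.match(urlExp) || []).filter(function (url) {
      // keep only urls whose last path segment does not end in .jpg
      return !/\.jpg$/i.test(url);
    });
    // nonJpgUrls -> ["http://www.blabla.com/page"]
    ```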

    Read the article

  • Question about SSL Certificate.

    - by smwikipedia
    Hi experts, I am trying to make an SSL connection to a web site. Each time I enter the https:// address and press enter, IE8 prompts me to select the certificate (client certificate) to send to the server. I have 2 certificates to choose from, and they are stored under IE8 - Internet Options - Content - Certificates - Personal. Since my server and client are the same machine, I want to use a single certificate for both server and client, and this certificate is an IIS-generated self-signed certificate. I do the following steps: 1) generate a self-signed cert in IIS; 2) bind my site to https and choose the above self-signed cert; 3) import the self-signed cert at IE8 - Internet Options - Content - Certificates - Personal. Then I use the https link to access my page, and it still prompts me to choose a certificate, but I cannot see my newly imported self-signed cert. Why?

    Read the article

  • Anonymous ASP.NET form with SSL, SharePoint

    - by user307852
    Hi, I want to create a contact form with SSL. I have created a simple asp.net contact form without ssl and now I must add it. It is in a SharePoint project, but it seems to be the same case as a plain asp.net form. The web application is anonymous and there will be no login use case, so the whole web application must work via http://, but when the user goes to the contact form, it must work via https://. I know how to do the redirect to https:// programmatically, and I've been searching for how to configure SSL on IIS, but that doesn't seem to be the whole answer. I don't want the whole web application to work via https, only my contact form - how do I do that and how do I configure it? The data from my form will be passed to a database, but that is not important here.

    Read the article

  • Make Codeigniter ignore directory

    - by Noah Goodrich
    I have Codeigniter installed and working for my main site. But I am now trying to add an add-on domain to the same hosting account, so I can have two sites running on the same hosting. Add-on domains create a new folder in the main public_html folder to store the web files. How can I get Codeigniter to ignore this directory? The site doesn't load properly when I try to view it. I also have an SSL on the main site and redirection for www URLs. Here's my .htaccess file:

        RewriteEngine on
        Options +FollowSymLinks
        RewriteBase /
        RewriteCond %{HTTP_HOST} ^www\.mysite\.co.uk$ [NC]
        RewriteRule ^(.*)$ http://mysite.co.uk/$1 [L,R=301]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /index.php/$1
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} (site|sections|here)
        RewriteRule ^(.*)$ https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !(site|sections|here)
        RewriteRule ^(.*)$ http://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
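
    One approach worth trying, sketched here only ("addonsite" is a placeholder for whatever folder cPanel created for the add-on domain): exclude that directory from the CodeIgniter front-controller rewrite, the same way real files and directories are excluded:

    ```apache
    RewriteCond %{REQUEST_URI} !^/addonsite/
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*)$ /index.php/$1
    ```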

    Read the article

  • PHP getting full server name including port number and protocol

    - by vivid-colours
    In PHP, is there a reliable and good way of getting these things: the protocol (i.e. http or https), the server name (e.g. localhost), and the port number (e.g. 8080)? I can get the server name using $_SERVER['SERVER_NAME']. I can kind of get the protocol, but I don't think it's perfect:

        if(strtolower(substr($_SERVER["SERVER_PROTOCOL"],0,5))=='https') {
            return "https";
        } else {
            return "http";
        }

    I don't know how to get the port number though. The port numbers I am using are not 80; they are 8080 and 8888. Thank you.
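
    A sketch of one common way to do this, assuming a typical Apache/PHP setup: the port is usually available in $_SERVER['SERVER_PORT'], and HTTPS detection tends to be more reliable via $_SERVER['HTTPS'] than via SERVER_PROTOCOL, which normally holds a value like "HTTP/1.1" rather than the scheme:

    ```php
    <?php
    function current_base_url()
    {
        $https    = !empty($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) !== 'off';
        $protocol = $https ? 'https' : 'http';
        $host     = $_SERVER['SERVER_NAME'];        // e.g. "localhost"
        $port     = (int) $_SERVER['SERVER_PORT'];  // e.g. 8080 or 8888

        // Append the port only when it is not the default for the scheme.
        $isDefault = ($https && $port === 443) || (!$https && $port === 80);

        return $protocol . '://' . $host . ($isDefault ? '' : ':' . $port);
    }

    echo current_base_url();   // e.g. "http://localhost:8080"
    ```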

    Read the article

  • Git 1.7.10 asks me for github username and password

    - by Daniel Ruf
    Since I have the new version, it no longer asks me for the password I set on my ssh key file. Now it asks directly for a GitHub username and password every time I push. Is this a new feature of git, did it change at some point, or did something change on GitHub? I tried to authenticate using ssh and the email and password from my ssh key file, and it worked. GitHub switched to smart HTTP and also changed the instructions for setting up repos: https://github.com/blog/1104-credential-caching-for-wrist-friendly-git-usage https://help.github.com/articles/create-a-repo I saw it later: they now use https instead of the git protocol.
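
    If the remote ended up on an https url after following the new instructions, git will prompt for GitHub credentials instead of using the ssh key. A sketch of the usual check and fix (USER/REPO are placeholders):

    ```sh
    git remote -v                                           # an https://github.com/... url means password prompts
    git remote set-url origin git@github.com:USER/REPO.git  # point the remote back at the ssh url
    ```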

    Read the article

  • How can I fix this regex to allow a specific string?

    - by Sailing Judo
    This regex comes from Atwood and is used to filter out anchor tags with anything other than the href and a title:

        <a\shref="(\#\d+|(https?|ftp)://[-A-Za-z0-9+&@#/%?=~_|!:,.;]+)"(\stitle="[^"]+")?\s?>

    I need to allow an additional attribute that specifically matches target="_blank", so the following anchor should be allowed:

        <a href="http://www.google.com" target="_blank">

    I tried changing the pattern to these:

        <a\shref="(\#\d+|(https?|ftp)://[-A-Za-z0-9+&@#/%?=~_|!:,.;]+)"(\stitle="[^"]+")(\starget="_blank")?\s?>
        <a\shref="(\#\d+|(https?|ftp)://[-A-Za-z0-9+&@#/%?=~_|!:,.;]+)"(\stitle="[^"]+")(\starget=\"_blank\")?\s?>

    Clearly I don't know regex very well. How should the pattern be adjusted to allow the blank target and no other targets?
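
    A hedged guess at the fix, not a verified answer: both attempted patterns dropped the ? after the title group, which makes a title mandatory, so an anchor with only href and target can never match. Keeping both trailing groups optional gives:

    ```
    <a\shref="(\#\d+|(https?|ftp)://[-A-Za-z0-9+&@#/%?=~_|!:,.;]+)"(\stitle="[^"]+")?(\starget="_blank")?\s?>
    ```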

    Read the article

  • Using .htaccess to rewrite dynamic subdomains

    - by brokekidweb
    I'm currently using .htaccess rewriting on my website to create dynamic subdomains. I also use it to remove the .php extension on all pages. However, once inside the subdomain, it tries to redirect again. For example, going to https://admin.example.com/test should actually access https://example.com/clients/admin/test.php. I keep getting various 404 errors using the following .htaccess file:

        Options +MultiViews
        Options +FollowSymLinks
        RewriteEngine On
        RewriteRule ^subdomains/(.*)/(.*) http://$1.example.com/$2 [r=301,nc]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^([^\.]+)$ $1.php [NC,L]
        RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
        RewriteCond %{HTTP_HOST} ^(www\.)?([^\.]+)\.example\.com$ [NC]
        RewriteCond %{DOCUMENT_ROOT}/%2%{REQUEST_URI}/ -d
        RewriteRule [^/]$ %{REQUEST_URI}/ [R=301,L]
        RewriteCond %{ENV:REDIRECT_STATUS} ^$
        RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
        RewriteCond %{HTTP_HOST} ^(www\.)?([^\.]+)\.example\.com$ [NC]
        RewriteRule ^(.*)$ clients/%2/$1 [QSA,L]

    How can I keep this from redirecting to https://admin.example.com/clients/admin/test.php?

    Read the article

  • Realtek HD Audio playing weird with certain video formats

    - by dyasny
    Hi, I have a Gigabyte motherboard with an onboard Realtek HD sound card. The card works perfectly everywhere, except for a single video format, where the voice is distorted and sounds as if it has been passed through a metal tube. I've been googling for this, but couldn't find an answer anywhere. The movie plays fine on other systems (got Linux everywhere else), but on this one (winXP-x64-sp2) it just doesn't. Here are some details from MPC:

        Type: KLCP WMV File
        Audio: 0x000a 22050Hz mono 20Kbps [Raw Audio 0]
        Video: Windows Media Video 9 400x300 29.97fps 227Kbps [Raw Video 1]

    And from VLC:

        Codec: wmas
        Sample rate: 22050
        Bits per sample: 16
        Bitrate: 20kb/s

    Read the article

  • Access denied errors in windows server 2008 for system administrator

    - by NLV
    Hello, I have an HP Pavilion DV4 with Windows Server 2008 R2 Enterprise. My system is connected to a domain and I log in using my domain account. I am the administrator for the local machine, but I'm getting access denied everywhere (file system, running commands, using Visual Studio... literally everywhere) on my machine. What could be the problem? Any ideas? Please tell me if you need more information, as I'm just a developer with little knowledge of system administration.

    Read the article

  • Restrict only some plugins to specific sites in Google Chrome

    - by Christian
    I am looking for a way to set up Google Chrome so that it will run a certain plug-in (Java, what else?) only on whitelisted sites, but other plug-ins (like the PDF viewer) everywhere. From playing with the policies available for Chrome, I think there are basically two levels of plug-in management:

        1. List of disabled plugins / enabled plugins: controls whether a plug-in exists for the browser at all. This pair of policies applies to plug-ins, but not to sites.
        2. Default plug-in settings / Allow plug-ins on sites: controls on which sites plug-ins can run. This set of policies applies to sites, but not to individual plugins, and it cannot override the first pair.

    There appears to be no way to configure Chrome so that some plug-ins only run on whitelisted sites, but others run everywhere by default. I have also looked at filtering content on the firewall/proxy level, but I'm not convinced it can be done securely there. Filtering by URLs (file names) or content types can be circumvented trivially, and identification by content inspection cannot be safe either.

    Read the article

  • Mimicking Mac-style command/alt/control keys in Linux

    - by Kenrick Rilee
    I absolutely love that Mac separates the command key from the control key, allowing OS shortcuts and text shortcuts to co-exist. It's incredibly useful, especially because it allows emacs shortcuts everywhere. I've searched almost everywhere for some kind of utility that can allow this and can't find anything. Any help? Note: I want to do more than just remap my keyboard. I want to actually split OS shortcuts and text shortcuts. The only way I can see doing that is to manually go through each shortcut in Gnome and Compiz and change it.

    Read the article

  • Best Practices vs Reality

    - by RonHill
    On a scale depicting how closely best practices are followed, with "always" on one end and "never" on the other, my current company falls uncomfortably close to the latter. Just a couple of trivial examples:

        - We have no code review process.
        - There is very little documentation despite a very large code base (and some of it is blatantly incorrect/misleading).
        - Untested/buggy/uncompilable code is frequently checked in to source control.
        - It is comically complicated to create a debuggable build for some of our components because of their underlying architecture.
        - Unhandled exceptions are not uncommon in our releases.
        - Empty Catch{ } blocks are everywhere.

    Now, with the understanding that it's neither practical nor realistic to follow ALL best practices ALL the time, my question is this: how closely have commonly accepted best practices been followed at the companies you've worked for? I'm kind of a noob--this is only the second company I've worked for--so I'm not sure if I'm just more of an anal retentive coder or if I've just ended up at mediocre companies. My guess (hope?) is the latter, but a coworker with way more experience than me says every company he's ever worked for is like this. Given the obvious benefits of following most best practices most of the time, I find it hard to believe it's like this everywhere. Am I wrong?

    Read the article

  • Smarty: Configurable Comments and Code Templates

    - by Martin Fousek
    Hello, today we would like to show you a few improvements we have prepared in the PHP Smarty Framework support for NetBeans 7.3: the adjustable toggle comment action and code templates. Configurable Comments: as some of you requested, we implemented a toggle comment action with adjustable behavior. In NetBeans 7.3 you can choose in Options between "Smarty comments everywhere" and "Language sensitive comments" in Smarty templates (screenshots: toggle comment language sensitive; toggle comment as Smarty comment everywhere). Code Templates: in NetBeans 7.3 we will provide by default many code templates inside Smarty templates or directly inside Smarty tags. Code templates should be available for all built-in and custom functions and modifiers of Smarty 3.x. Besides that, you should be able to define additional custom templates easily in Options -> Editor -> Code Templates for "Smarty Templates" or directly for "Smarty Markup" (which means code templates inside a Smarty tag). You can also take advantage of selection templates, which can wrap your code in a chosen Smarty tag. That's all for today. As always, please test it and report all the issues or enhancements you find in NetBeans BugZilla (component php, subcomponent Smarty).
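
    For readers who haven't seen the two comment syntaxes the option switches between, a small illustration (added here, not from the original post):

    ```smarty
    {* A Smarty comment: stripped at compile time, never sent to the browser *}
    <!-- An HTML comment: passed through and visible in the page source -->
    ```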

    Read the article

  • Cannot access personal website from home IP. More details inside.

    - by GX67
    This is a recent problem I've been having. My site can be accessed from almost everywhere else except from my home IP, where I do most of my editing/updating, etc. I've tested my connection from my school's network, from a friend's connection out of state (multiple states), and through a tethered connection on my friend's Android. It works in all those cases: viewing, accessing the cPanel, and using FTP. Here's the problem when I try to view it from my home IP: the page times out in Firefox, IE, and Chrome. Using the cmd, I ran tracert and ping, both failed attempts. Log here. downforeveryoneorjustme.com says my site is up, and so do the other site checkers. I can't access my cPanel or FTP accounts. I can't access the host site. (I use perfectz.info for hosting, and I can't access their site either.) System settings: no firewall enabled; ports are seemingly properly forwarded (e.g. the ports are open in the router settings, and are open everywhere else). I have an email forwarder set up from the cPanel that works just fine (i.e. I can receive emails sent to that address). If any other information is needed, I'll do my best to provide it. UPDATE @ilhan: I use two things: 1) the site cPanel in-browser, and 2) Dreamweaver CS5 FTP. @Matthias: I tested both, and it passes the dual stack test with a 10/10. What should I do then?

    Read the article

  • Creating a bootable flash without overlayfs

    - by Septagram
    I want to create a USB stick to carry my Ubuntu around with me everywhere. It's not intended to spread Ubuntu by installing it everywhere, but rather for running my configured system on any computer I come across. So far, I went with installing Ubuntu with unetbootin; however, I have some issues with this. When installed with unetbootin, the original disk image is kept intact on the flash drive, forever. Also, a file is created for persistent storage, and during boot it is accessed together with the image by overlayfs. This, in my opinion, has the following problems:

        - If the system is updated regularly, files from the image are overwritten in persistent storage, doubling their size and wasting precious space.
        - Persistent storage has a fixed size that you have to define from the start, again wasting precious space.
        - I'm not 100% sure, but maybe using overlayfs makes disk access slower, and more so on relatively slow devices.

    So I'd like to find another solution: either to get rid of the original image, to install Ubuntu "normally" on a separate ext2 partition, or maybe even to install it in the main vfat partition on the USB stick. Suggestions?

    Read the article

  • No MAU required on a T4

    - by jsavit
    Cryptic background: One of the powerful features of the T-series servers is hardware crypto acceleration, which dramatically speeds up the compute-intensive algorithms needed to encrypt and decrypt data. Previously, administrators setting up logical domains on older T-series servers had to explicitly assign crypto resources (called "MAU" for historical reasons, from the T1 chip that had "modular arithmetic units") to domains that had a significant crypto workload (say, an SSL-based web server). This could be an administrative burden, as you had to choose which domains got the crypto units, and issue the appropriate ldm set-mau N mydomain commands. The T4 changes things: The T4 is fast. Really fast. Its clock rate and out-of-order (OOO) execution provide the single-thread performance that T-series machines previously did not have. If you have any preconceptions about T-series performance, or SPARC in general, based on the older servers (which, it must be said, were absolutely outstanding for multi-threaded applications), those assumptions are now obsolete. The T4 provides outstanding performance for all kinds of workloads, as illustrated at https://blogs.oracle.com/bestperf. While we all focused on this (did I mention the T4 is fast?), another feature of the T4 went largely unnoticed: the T4 servers have crypto acceleration "just built in", so administrators no longer have to assign crypto accelerator units to domains - it "just happens". This is way, way better, since you have crypto everywhere by default without having to manage it like a discrete and limited resource. It's a feature of the processor, like doing an integer add. With T4, there is no management necessary; you just have HW crypto everywhere, all the time, seamlessly. This change hasn't been widely advertised, and some administrators have wondered why they were unable to assign a MAU to a domain as they did with T2 and T3 machines. The answer is that there is no longer any separate MAU, so you don't have to take any action at all - just leave the default of 0. Summary: Besides being much faster than its predecessors, the T4 also integrates hardware crypto acceleration so it's seamlessly available to applications, whether domains are being used or not. Administrators no longer have to control how crypto units are allocated - it "just happens".

    Read the article
