Search Results

Search found 47251 results on 1891 pages for 'web storage'.

Page 881/1891

  • What are possible reasons why a calendar entry in OWA is at a different time than in Outlook?

    - by Ken Pespisa
    We have two Exchange 2003 servers: our primary server and a front-end server that hosts Outlook Web Access (OWA). When I open my boss's calendar in Outlook 2007 (from my Outlook client as well as hers), I see the event scheduled for 10:30 am. When I open her calendar via Outlook Web Access, the same event is scheduled for 4:30 am. I don't understand Exchange well enough to imagine how this is possible, so I'd greatly appreciate any ideas about why it could be happening. I suppose there must be some cached data on the front-end server that causes the calendar entry to appear at a different time. Any insight into how Exchange manages that cache, and where I could look for an issue, would be very helpful. Thank you!

    Read the article

  • Copying large Windows directory structure to new server with permissions intact

    - by Chris
    I'm soon going to be doing a large migration of an old-school web server that serves mostly ASP pages (currently on a Windows 2003 server) to a newer, virtualized Windows 2008 server. This new server is also going to be in a different domain. So I'm copying the root web folder, and all its subfolders and files, to the new server, and I'd like to keep permissions intact. The structure is also pretty massive, so ideally I'd like to be able to compress it before transferring it while still keeping the permissions. Is there any way to do that? And will the new server being in a different AD domain screw with my plans?
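    A hedged sketch of one common approach: robocopy can copy the tree with its ACLs directly to the new server over an admin share, and icacls can export/import the ACLs separately if you compress and move the data another way. The server name and paths below are placeholders; note that across AD domains the copied SIDs only resolve for accounts the target domain can see (or via a trust), so anything else will show up as orphaned SIDs.

        robocopy D:\inetpub\wwwroot \\NEWSERVER\D$\inetpub\wwwroot /E /COPYALL /ZB /R:2 /W:5 /LOG:C:\migrate.log

        rem -- or, if you zip the files and move them by hand, carry the ACLs separately:
        icacls D:\inetpub\wwwroot\* /save acls.txt /t /c
        rem -- after extracting on the new server:
        icacls D:\inetpub\wwwroot /restore acls.txt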

    Read the article

  • building a home server with a nas appliance [closed]

    - by user51666
    Possible Duplicate: Best way to build home NAS with redundancy. I was hoping to get some ideas from folks here. I'm interested in building a home web server with a NAS appliance. It would primarily be used for storing pictures and video. I want a networked storage device so that multiple devices can access it wirelessly from within our home, and I also want the option to access it from outside the house behind a login/password. I'm interested in customizing it and building my own web pages as well, preferably with Apache. Any preferences? Does anyone have an interesting, neat setup they can share? Thanks!

    Read the article

  • Running Tomcat 7 and Apache 2 on the same server

    - by Thorn
    Part of my site needs to run over HTTPS and I'm creating a sub-domain for that part. I have Apache httpd 2 AND Tomcat 7 running on the same server with the same IP; Apache is on port 80, of course, while Tomcat is running on port 8080. Right now I am doing domain forwarding for requests that need to run off Tomcat. For example, mathteamhosting.com/mathApp can forward to mathteamhosting.com:8080/mathApp. I would like to have Tomcat handle the HTTPS requests for that subdomain, and I don't think this forwarding technique can work in this case. How do I set things up so that Tomcat receives the requests on port 443 while Apache handles port 80? To be more specific:
        http://proctinator.com == request goes to the Apache web server
        https://private.proctinator.com == request goes to Tomcat
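    One way to do exactly what is asked is to give Tomcat its own HTTPS connector on port 443. A hedged sketch for Tomcat 7's conf/server.xml (keystore path and password are placeholders; on Linux, binding to 443 as a non-root user additionally needs authbind, jsvc or an iptables redirect):

        <!-- Tomcat conf/server.xml: listen for HTTPS directly on 443 -->
        <Connector port="443" protocol="HTTP/1.1" SSLEnabled="true"
                   maxThreads="150" scheme="https" secure="true"
                   keystoreFile="/path/to/keystore.jks" keystorePass="changeit"
                   clientAuth="false" sslProtocol="TLS" />

    The common alternative is to let Apache own both 80 and 443, terminate SSL there, and proxy the subdomain's vhost to Tomcat on 8080 with mod_proxy/mod_proxy_http; that avoids the low-port permission issue entirely.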

    Read the article

  • What is the proper way to set up the Apache document root in terms of privileges?

    - by racl101
    I have just installed Ubuntu 9.10 server edition on my machine and I wish to run my own personal local server for other users on the same LAN. First, I was wondering what directory structure is best for the web root. Should I just use /var/www/ and start throwing web documents in there, or should I create a folder elsewhere (maybe under a home directory)? Second, only the root user can create documents in /var/www/, but I wish to let other users create files in the document root and upload them via FTP. Should I change the permissions of the www/ folder? Or, again, should I create the document root elsewhere with different permissions? What is the safest way of doing this?
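    A hedged sketch of a common pattern on Ubuntu: keep the web root under /var/www/, but hand write access to a dedicated group and set the setgid bit on directories so new uploads inherit that group (the group and user names below are placeholders):

        sudo addgroup webdev
        sudo usermod -a -G webdev alice                      # repeat for each user who should upload
        sudo chgrp -R webdev /var/www
        sudo find /var/www -type d -exec chmod 2775 {} +     # setgid keeps the group on new files
        sudo find /var/www -type f -exec chmod 664 {} +

    The FTP users then land in a tree they can write to, while Apache only needs read access and root ownership of the files is no longer required.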

    Read the article

  • Tomcat and HTTPS connect timeout (local Proxy resolves it)

    - by smas
    I have a web application with web services running on Tomcat. I've noticed that all web service calls to HTTPS URLs time out. I run this app on localhost inside my company network. When I redirect all my connections through Fiddler (a local proxy), everything works correctly, but I don't want to run Fiddler all the time.
        my computer -> [FIDDLER local proxy] -> [remote proxy]  // WORKS
        my computer -> [remote proxy]                           // timeout
    How can I increase Tomcat logging to get more technical detail than just "timeout"? Is there any other way to get more information about what is blocking the HTTPS URLs?
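    The Fiddler symptom often means the JVM is not picking up the corporate proxy for its outbound HTTPS calls. A hedged sketch: set the standard JVM proxy properties and SSL debugging in Tomcat's setenv script (the proxy host and port below are placeholders):

        # $CATALINA_HOME/bin/setenv.sh (setenv.bat on Windows)
        export CATALINA_OPTS="$CATALINA_OPTS \
          -Dhttp.proxyHost=proxy.mycompany.local  -Dhttp.proxyPort=8080 \
          -Dhttps.proxyHost=proxy.mycompany.local -Dhttps.proxyPort=8080 \
          -Djavax.net.debug=ssl,handshake"

    -Djavax.net.debug writes the TLS handshake to catalina.out, which is usually enough to see whether the connection dies at the TCP connect (likely a proxy/firewall issue) or during the handshake itself.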

    Read the article

  • Manipulating Exchange 2003 shared contacts folder remotely

    - by andybak
    I've got a CRM web app running on a remote server that needs to synchronise its contacts with the in-house Exchange 2003 shared contacts. Exchange 2003 doesn't appear to support web services. What would the typical approach to this problem be? My initial instinct would be to open port 80 on the Exchange server, run a simple web server, POST to it, and then control Exchange via OLE automation scripting (if that's what people are still calling it!), but there might be a better solution I'm not aware of. Any suggestions?
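    Exchange 2003 predates Exchange Web Services, but it does expose mailboxes and public folders over WebDAV through its IIS virtual directories, which may remove the need for a custom listener on the Exchange box. A hedged, purely illustrative sketch with curl (the host name, folder path and account are assumptions):

        curl --ntlm -u 'DOMAIN\svc_crm' -X PROPFIND \
             -H "Depth: 1" -H "Content-Type: text/xml" \
             --data '<?xml version="1.0"?><a:propfind xmlns:a="DAV:"><a:prop><a:displayname/></a:prop></a:propfind>' \
             https://mail.example.com/public/Shared%20Contacts/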

    Read the article

  • How does Amazon ec2-user get its sudo rights

    - by Johan
    I am looking for where the default Amazon Linux AMI sets up the privileges for the default ec2-user account. After logging in with this account I can use sudo successfully. Checking the sudoers file, which I open by running visudo (with no other options), I only see a few default settings and the permissions for root (root ALL=(ALL) ALL). So ... where are the permissions for ec2-user assigned? I have not yet tried to add a new permission, but ultimately I want to reserve ec2-user for systems management tasks and use a non-full-root user for administering the applications (stop and start mysql, httpd, edit Apache's vhost files, and upload / edit web content under the web root).
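    On the Amazon Linux AMI the grant normally lives not in /etc/sudoers itself but in a cloud-init-managed drop-in under /etc/sudoers.d/, which plain visudo does not show. A hedged sketch for locating it, and for adding a limited application-admin account (the user name and command list are placeholders):

        sudo grep -r ec2-user /etc/sudoers /etc/sudoers.d/

        # new drop-in for a restricted admin account
        sudo visudo -f /etc/sudoers.d/webadmin
        # contents:
        webadmin ALL=(root) NOPASSWD: /sbin/service httpd *, /sbin/service mysqld *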

    Read the article

  • How to give write permissions to multiple users?

    - by Daniel Rikowski
    I have a web server and I'm uploading files using an FTP client, so the owner and group of each file are taken from the user used during the upload. Now I have to make a file writable by the web server (apache/apache). One way would be to just change the owner and group of the uploaded file to apache/apache, but then I cannot modify the file using the FTP account. Another way would be to give the file 777 permissions. Neither approach seems very professional, and both are a little risky. Are there any other options? In Windows I can just add another user to the file's permissions. Can something similar be done on Linux?
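    POSIX ACLs are the closest Linux equivalent to adding another user to a file's permissions. A hedged sketch (the paths are placeholders, and the filesystem must be mounted with ACL support, which ext3/ext4 on most distributions allow):

        # let the apache user write this one file without changing its owner or group
        sudo setfacl -m u:apache:rw /var/www/site/data/config.xml

        # or a whole upload directory, with a default ACL so new files inherit it
        sudo setfacl -R -m u:apache:rwX -m d:u:apache:rwX /var/www/site/uploads/

        getfacl /var/www/site/uploads/    # verify the result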

    Read the article

  • ASPX code to run a query

    - by Akoori
    I have a web.config like the one below:
        </appSettings>
        <authentication mode="Windows" />
        <authorization>
          <allow users="*" />  <!-- Allow all users -->
        </authorization>
        <trace enabled="false" requestLimit="10" pageOutput="false" traceMode="SortByTime" localOnly="true" />
        <sessionState mode="InProc" stateConnectionString="tcpip=127.0.0.1:42424"
                      sqlConnectionString="data source=127.0.0.1;Trusted_Connection=yes"
                      cookieless="false" timeout="30" />
        <globalization requestEncoding="utf-8" responseEncoding="utf-8" />
    I need ASPX code that runs a query using the connection string that is in this web.config. Regards
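    A hedged sketch of a single-file ASPX page that opens a SqlConnection with that same connection string and runs a query (the database and table names are placeholders; in practice the string is better kept in a <connectionStrings> section and read via ConfigurationManager):

        <%@ Page Language="C#" %>
        <%@ Import Namespace="System.Data.SqlClient" %>
        <script runat="server">
            protected void Page_Load(object sender, EventArgs e)
            {
                // same server and credentials as the sessionState sqlConnectionString above
                string connStr = "data source=127.0.0.1;Trusted_Connection=yes;Initial Catalog=MyDb";
                using (SqlConnection conn = new SqlConnection(connStr))
                using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM MyTable", conn))
                {
                    conn.Open();
                    Response.Write("Rows: " + cmd.ExecuteScalar());
                }
            }
        </script>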

    Read the article

  • Directing multiple domains to one server

    - by dtechie
    Hi, I would like to host 5 WordPress blogs on the unlimited server space I bought from a web-hosting company similar to HostGator. The company says it is technically not possible to direct more than one domain to the hosting package and that I would need to buy multiple hosting packages. I have read about vhosts (Apache), domain aliases and IP forwarding, but I'm not sure how to ask my web host about them when they say it is not possible. Here is what I want to do:
        xhost.com/folder1 << www.x.com
        xhost.com/folder2 << www.y.com
        xhost.com/folder3 << www.z.com
        xhost.com/folder4 << www.aa.com
    The web host provides cPanel, so I don't know whether they give access to vhosts if they are a reseller. Thank you for your help.
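    Assuming the extra domains can at least be pointed (parked) at the account's IP, a .htaccess in the web root can route each Host header into its own folder; this is essentially what cPanel's "addon domain" feature does when a plan allows it. A hedged sketch for two of the domains (requires mod_rewrite):

        RewriteEngine On

        RewriteCond %{HTTP_HOST} ^(www\.)?x\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/folder1/
        RewriteRule ^(.*)$ /folder1/$1 [L]

        RewriteCond %{HTTP_HOST} ^(www\.)?y\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/folder2/
        RewriteRule ^(.*)$ /folder2/$1 [L]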

    Read the article

  • Access server bound to localhost:5000 from different computer

    - by Jesse
    I am working on a web application using the Pylons framework. The web server is binding to localhost:5000 so I am able to access my application by going to localhost:5000 in my browser. I would like to be able to access the server from another computer on the same network. The computer that is hosting the server and application is running Mac OSX and the computer I would like to be able to access the application is running Windows 7 (I have cygwin with SSH installed as well as PuTTY). I could work around this by binding to the host name of the computer but would rather leave it running only on localhost. I was thinking I could do something with SSH tunneling but have not had any luck so far. Any ideas?
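    One way to keep the Pylons server bound to localhost is an SSH tunnel from the Windows machine, which makes the Mac's loopback-only port look local (assuming Remote Login/SSH is enabled on the Mac; the user name and the Mac's LAN address are placeholders):

        # from cygwin on the Windows 7 machine
        ssh -N -L 5000:localhost:5000 macuser@192.168.1.10
        # then browse to http://localhost:5000 on the Windows machine

    In PuTTY the equivalent is Connection > SSH > Tunnels: source port 5000, destination localhost:5000, type Local.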

    Read the article

  • Virtualbox PUEL Interpretation

    - by modernzombie
    Sorry if this seems like a lame question, but I want to be sure before making a decision. The VirtualBox PUEL license says "Personal Use" requires that you use the Product on the same Host Computer where you installed it yourself, and that no more than one client connect to that Host Computer at a time for the purpose of displaying Guest Computers remotely. I take this to mean that if I set up a development (web) server that's only used by me to do my work, this falls under personal use. But if I make this server available for clients to connect to the websites and view my progress, it is no longer personal use, which would also mean that using VirtualBox to run a production web server is against the license. Again, sorry if this is a dumb question, but I find it hard to follow the wording used in licenses. I know I could go with the OSE, but I have not looked into VNC versus RDP yet. Thanks.

    Read the article

  • Apache deny access to images folder, but still able to display via <img> on site

    - by jeffery_the_wind
    I have an images folder on my site, let's call it /images/, where I keep a lot of images. I don't want anyone to have direct access to the images via the web, so I put a new directive in my Apache config that achieves this:
        <Directory "/var/www/images/">
            Options Includes
            AllowOverride All
            Order allow,deny
            Deny from All
        </Directory>
    This is working, but it blocks ALL access, and I can't show the images through my web pages anymore. I guess this makes sense. So how do I selectively control access to these images? Basically I only want to display certain images through certain web pages and to certain users. What is the best way to do this? Do I need to store the images in the database? Tim
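    One thing worth noting: the browser fetches <img> sources with ordinary HTTP requests, so "Deny from All" blocks those too. A common compromise is a Referer check, which stops direct browsing and casual hotlinking but is easily spoofed; for genuine per-user control the images have to be served through a script that checks the session. A hedged sketch of the Referer variant (the site name is a placeholder):

        <Directory "/var/www/images/">
            SetEnvIfNoCase Referer "^https?://(www\.)?example\.com/" local_ref=1
            Order deny,allow
            Deny from all
            Allow from env=local_ref
        </Directory>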

    Read the article

  • Ubuntu+Mono+Postgres+ASP.NET 4.0. No problem?

    - by wreck_of_u
    Would this be OK? I'm an ASP.NET developer and I'm planning to build "portable" web app servers based on Atom D510 mini-ITX boards. I have run Ubuntu 10 with MySQL alongside separate IIS machines (Win 2k3, 2k8) before with no problems. But now I'm thinking of "packaging" a web/db server into one small, cheap machine. I thought Ubuntu/Mono/Postgres/ASP.NET would be a good combination, but I'm not sure and have not actually tried it yet. Your thoughts?

    Read the article

  • Local dns for testing websites using mobile devices

    - by Morpheu5
    Hi. I have no idea where to start from, so sorry in advance if this topic has already been discussed. I usually develop web sites using my laptop as a development server, and recently I needed to test a web site on various mobile devices that connect via wifi. Having no real AP, I set up an ad-hoc network using my laptop's wireless card, and the devices can correctly browse the Internet and access the laptop's web server. The setup is as follows:
        subnet: 192.168.1.0/24
        gateway to the Internet (wired ADSL router/modem): 192.168.1.1
        laptop: 192.168.1.64 (eth0, the wired interface connected to the gateway) and 192.168.1.32 (eth1, the wifi interface, somewhat bridged to eth0)
        mobile devices (same for all; I only use one at a time for simplicity): 192.168.1.11 with default gw 192.168.1.1
    Now, if I open either 192.168.1.32 or 192.168.1.64 from the mobile devices, I correctly get the default host of my Apache configuration. However, I usually work with virtual hosts for many practical reasons, one of which is Drupal's peculiar implementation of multi-sites. For those who don't know how this works, Drupal takes the request's hostname and searches its sites/ subdirectories for an appropriate configuration file. So, for example, suppose I request www.example.com; Drupal would search for a config file in the following directories:
        sites/www.example.com/
        sites/example.com/
        sites/com/
        sites/default/
    So I decided to adopt the following style of virtual hosts: if the website I'm working on will be accessible as www.example.com, I set up a sites/www.example.com/ directory and create a virtual host for local.www.example.com so Drupal has no trouble finding it. I've been told this is suboptimal from a DNS point of view, since I'd have to create an authoritative entry for example.com and turn BIND on only when I'm supposed to access the local copy, which is weird. However, if this is the only path I can follow, I still have some problems with BIND's configuration, as I couldn't find any guide that explains, in a clear, noob-friendly way, how to set up such an entry. On the other hand, I was wondering if I could set up an authoritative entry for local, so I could access www.example.com.local and somehow tell Apache (I don't even know if this is possible) to put www.example.com instead of www.example.com.local in the relevant environment variable. Anyway, I have one last problem, sort of: when I launch BIND in debug mode with high verbosity and make 192.168.1.32 the primary DNS for the devices, the output doesn't say anything about requests coming from the devices, so I'm not even sure BIND comes into play at all. As you can see, I'm a complete noob at these matters, but I'm eager to learn, so any help/pointer will be appreciated.
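    A hedged sketch of the authoritative-zone approach with BIND 9 on Debian/Ubuntu (zone name, file paths and serial are illustrative; the devices' DNS must point at 192.168.1.32, and named.conf.options has to allow queries/recursion from the LAN):

        // /etc/bind/named.conf.local
        zone "www.example.com" {
            type master;
            file "/etc/bind/db.www.example.com";
        };

        ; /etc/bind/db.www.example.com
        $TTL 300
        @    IN  SOA  ns.www.example.com. admin.www.example.com. ( 1 3600 600 86400 300 )
             IN  NS   ns.www.example.com.
             IN  A    192.168.1.32
        ns   IN  A    192.168.1.32

    Because the zone covers only www.example.com, every other name still resolves through BIND's normal recursion, so the devices keep their Internet access; Apache then just needs a vhost with ServerName www.example.com so Drupal sees the production hostname.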

    Read the article

  • Linux periodically "losing" ability to connect to server via SSH?

    - by gct
    I know this isn't exactly a programming question, but it popped up in my use of git for programming projects at least. I've got a web server that I use to host my git repos, but my Ubuntu box seems to "lose" the ability to connect to it via SSH. I'll get a "connection refused" error when I try to ssh or use git. Rebooting my local machine fixes the problem, but only temporarily. I can still connect to the web interface just fine, and the problem manifests with other servers as well. I've been working around it by pulling my changes over to my laptop and pushing from there, but that's sub-optimal as you can imagine. Has anyone seen something like this? I'd be tempted to say it's some kind of IP caching problem, but I can't connect even when using the server's IP address directly... Running Ubuntu 9.04.

    Read the article

  • What else can I do to secure my Linux server?

    - by eric01
    I want to put a web application on my Linux server. I will first explain what the web app will do, and then tell you what I have done so far to secure my brand new Linux system. The app will be a classified ads website (like gumtree.co.uk) where users can sell their items, upload images, and send to and receive emails from the admin. It will use SSL for some pages. I will need SSH. So far, what I did to secure my stock Ubuntu (latest version) is the following. NOTE: I probably did some things that will prevent the application from doing all its tasks, so please let me know if that's the case. My machine's sole purpose will be hosting the website. (I put numbers as bullet points so you can refer to them more easily.)
    1) Firewall: I installed Uncomplicated Firewall. Deny IN & OUT by default. Rules: allow IN & OUT for HTTP, IMAP, POP3, SMTP, SSH, UDP port 53 (DNS), UDP port 123 (SNTP), SSL, port 443 (the ones I didn't allow were FTP, NFS, Samba, VNC, CUPS). When I install MySQL & Apache, I will open up port 3306 IN & OUT.
    2) Secure the partition: in /etc/fstab, I added the following line at the end:
        tmpfs /dev/shm tmpfs defaults,rw 0 0
    Then in the console: mount -o remount /dev/shm
    3) Secure the kernel: in the file /etc/sysctl.conf, there are a few different filters to uncomment. I don't know which ones are relevant to web app hosting. Which ones should I activate (see the sysctl sketch below)? They are the following:
        A) Turn on Source Address Verification in all interfaces to prevent spoofing attacks
        B) Uncomment the next line to enable packet forwarding for IPv4
        C) Uncomment the next line to enable packet forwarding for IPv6
        D) Do not accept ICMP redirects (we are not a router)
        E) Accept ICMP redirects only for gateways listed in our default gateway list
        F) Do not send ICMP redirects
        G) Do not accept IP source route packets (we are not a router)
        H) Log Martian Packets
    4) Configure the passwd file: replace "sh" with "false" for all accounts except the user account and root. I also did it for the account called sshd. I am not sure whether that will prevent SSH connections (which I want to use) or if it's something else.
    5) Configure the shadow file: in the console, passwd -l to lock all accounts except the user account.
    6) Install rkhunter and chkrootkit.
    7) Install Bum. Disabled these services: "High performance mail server", "unreadable (kerneloops)", "unreadable (speech-dispatcher)", "Restores DNS" (should this one stay on?).
    8) Install apparmor-profiles.
    9) Install clamav & freshclam (antivirus and updates).
    What did I do wrong, and what more should I do to secure this Linux machine? Thanks a lot in advance
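    For point 3, on a single-homed web host (not a router) the filters that usually make sense are A, D, F, G and H, with E only relevant if you accept redirects at all; B and C (packet forwarding) should stay off. A hedged sketch of the corresponding /etc/sysctl.conf lines, applied with sudo sysctl -p:

        net.ipv4.conf.all.rp_filter = 1            # A: source address verification
        net.ipv4.ip_forward = 0                    # B: keep IPv4 forwarding off
        net.ipv6.conf.all.forwarding = 0           # C: keep IPv6 forwarding off
        net.ipv4.conf.all.accept_redirects = 0     # D: ignore ICMP redirects
        net.ipv6.conf.all.accept_redirects = 0
        net.ipv4.conf.all.send_redirects = 0       # F: don't send redirects
        net.ipv4.conf.all.accept_source_route = 0  # G: drop source-routed packets
        net.ipv4.conf.all.log_martians = 1         # H: log martian packets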

    Read the article

  • forward all ports via htaccess to new address

    - by user875933
    I have a chat server running on my local machine that listens on several ports, and I want to use a sub-domain on one of my hosting accounts to reach it. I intend to manually change the redirect whenever my local machine gets a different IP address. So chat.example.com:123 would redirect to dynamic.ip.address:123. I am trying to accomplish this with .htaccess and RewriteRule. I have tried:
        RewriteEngine on
        RewriteRule ^(.*) http://dynamic.ip.address/ [L, R=302]
    but this doesn't work. When I try chat.example.com:123, nothing happens. When I enter chat.example.com into the web browser, I get dynamic.ip.address. Is .htaccess the right tool for this? I am using a simple web host that gives me SSH access, but not much more.
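    One caveat worth stating plainly: a .htaccess rule only runs for requests that actually reach Apache, so a URL on port 123 never hits it unless the host's Apache also listens on that port; .htaccess cannot forward arbitrary ports. For the ports Apache does serve, a hedged sketch of a rule that keeps the requested path and avoids the space inside the flag list:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^chat\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://dynamic.ip.address/$1 [L,R=302]

    For the other ports, a DNS A record (or a dynamic-DNS hostname) for chat.example.com pointing at the current address is the more usual route, since clients then connect to those ports directly.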

    Read the article

  • Route a specific user's traffic via VPN but still allow local networking

    - by wbg
    So, I want to route certain traffic via a VPN connection and the rest via my normal Internet connection. I want to run several different programs and most of them don't support binding to a specific network interface (tun0 in my case). I've managed to send a specific user's traffic via the VPN following the answers given here: iptables - Target to route packet to specific interface? But unfortunately, when I run a server that connects to the Internet and has a web interface running on a local IP (127.0.0.1/192.168.0.*), all the Internet traffic correctly goes via tun0, but I'm unable to connect to the web interface from a local IP as a different user. When I log in as the VPN-ified user, I can access services running on local IPs, but other users/machines can't access any servers I start. Can anyone point me in the right direction?
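    Assuming the setup follows the owner-match/fwmark pattern from that answer, a hedged tweak is to exempt loopback and private destinations before the MARK rule so they keep using the main routing table (adjust the ranges, the user name and the mark value to match the existing rules):

        # insert exemptions ahead of the existing "-m owner --uid-owner vpnuser -j MARK" rule
        iptables -t mangle -I OUTPUT 1 -d 127.0.0.0/8    -j RETURN
        iptables -t mangle -I OUTPUT 1 -d 10.0.0.0/8     -j RETURN
        iptables -t mangle -I OUTPUT 1 -d 192.168.0.0/16 -j RETURN

    For the "other machines can't reach servers I start" part, replies to LAN clients may also be leaving via tun0; an equivalent exemption, or a route for the LAN subnet inside the marked routing table, usually addresses that as well.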

    Read the article

  • Securing internal data accessed by a website on the big, bad internet

    - by aehiilrs
    A close relative of this question on Stack Overflow: When you have a web site in your DMZ that needs to access production data stored on an internal DB, what strategies do you recommend using to lower the risks that come from accessing live data? Is it even considered acceptable to have a connection initiated from the DMZ come inside of your network? An extra detail about the nature of the site that kind of throws a monkey wrench into the machinery is that people using the web site will be competing for "spots" on a first-come, first-serve basis with others using the internal software. Because of this, as close to zero lag time between the two applications as possible is ideal.

    Read the article

  • How to show users the reason for a message being bounced or rejected by Postfix?

    - by Ross Bearman
    A user would like to be able to view a web page showing any emails that a Postfix server has been unable to send or receive. For example, if the user was supposed to receive an email from a third party but it hasn't arrived, they'd be able to check the web page and see a list of emails rejected by Postfix, along with a clear reason why. I've been unable to find an existing application that offers this functionality. Does anyone know of one, or is the best way forward to write a script that parses the log and displays the results?
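    If it does come down to a script, a hedged starting point is simply to pull the reject and bounce lines out of the Postfix log, since both carry the reason text (log path per Debian/Ubuntu defaults; tools like pflogsumm produce summaries, but not the per-message page described here):

        #!/bin/sh
        # rejected inbound mail and bounced outbound mail, most recent last
        grep -hE 'NOQUEUE: reject|status=bounced' /var/log/mail.log /var/log/mail.log.1 | tail -n 200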

    Read the article

  • Can't access server from external IP

    - by Mathias
    I have a problem with my web server: I can't access it from the external IP address. I'm using an IIS 7 server, but I've tried Apache on Linux as well. I have forwarded all traffic on port 80 to my computer, but it just won't work. I've done port forwarding for my Minecraft server and it worked, but with a web server it doesn't. I've been looking on many forums, but their methods don't work for me. My router is a Speedport W 723V, if anyone knows that one. Any help is appreciated.

    Read the article

  • IIS doesn't respond to 127.0.0.1 (external IP works fine)

    - by Jordan
    I have an AWS web server - call it box.company.com. It's running IIS and if I visit http://box.company.com in a web browser (from any machine, including box.company.com), it responds correctly with our site. However, if I visit localhost/ or 127.0.0.1/ when I'm logged into box.company.com, I get a "couldn't connect to host" message. The hosts file has only one entry - the standard "127.0.0.1 localhost" line. Pinging 127.0.0.1 works fine. Pinging localhost correctly resolves to 127.0.0.1 and works fine. I've tried restarting IIS and restarting the DNS Cache. I had this problem once before, and restarting the server fixed it, but I'd like to know what's going on in case this happens again in the future.
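    A couple of hedged checks to run on the box itself: confirm something is listening on 127.0.0.1:80, and that the site's binding is not restricted to the public/elastic IP, because a binding tied to a specific address will not answer on loopback:

        netstat -ano | findstr ":80 "
        %windir%\system32\inetsrv\appcmd.exe list sites

    If appcmd shows a binding like http/10.0.0.5:80: rather than http/*:80:, requests to 127.0.0.1 are never matched; setting the binding back to "All Unassigned" (*) restores localhost access.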

    Read the article

  • Roll standalone JBoss app under Tomcat

    - by Seva Alekseyev
    I've got a Linux box with Tomcat running and some JSP applications on it. Now I've received a third-party app from a developer shop to be deployed eventually. It came as an archive called "jboss7.tar" which, it seems, contains a whole standalone web server. Once I followed their instructions and ran the designated shell script, it started a server listening on port 8081, and the app pages are served up. Still, this strikes me as an inelegant setup: why run two Java-enabled web servers side by side? I also don't like having to start the standalone app manually. The real question is: can I take the user-provided portions of that archive and somehow plug them into the existing Tomcat instance? It looks like the user code is packaged into files with a .war extension; I can see them under /var/jboss7/standalone/deployments.
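    A hedged first attempt: if the application only uses servlet/JSP features (no EJBs, JMS or other Java EE services that JBoss provides and Tomcat doesn't), the WARs can simply be dropped into Tomcat's webapps directory and deployed on restart. Ubuntu's tomcat7 package layout is assumed below; adjust paths to the actual install:

        sudo cp /var/jboss7/standalone/deployments/*.war /var/lib/tomcat7/webapps/
        sudo service tomcat7 restart
        # check /var/log/tomcat7/catalina.out afterwards: ClassNotFoundException or JNDI lookup
        # errors usually mean the app depends on JBoss-only services and won't run under Tomcat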

    Read the article
