Search Results

Search found 22893 results on 916 pages for 'client scripting'.


  • Emails intended as HTML are received as plain text

    - by Jeremy
    I'm regularly receiving emails from a well-known public website that render as plain text, without line breaks or working hyperlinks. My email client is Thunderbird, and the Thunderbird help site doesn't offer an answer. I'm reluctant to complain to the website if the problem is at my end. The message source headers include this: Content-Type: multipart/alternative; boundary=--boundary_9338_03b8c925-816e-4b55-95c4-b2593da7e5f6 The content that follows the header in the message source is preceded by this: ----boundary_9338_03b8c925-816e-4b55-95c4-b2593da7e5f6 Content-Type: text/html; charset=utf-8 Content-Transfer-Encoding: base64 The content itself typically reads like this: PCFkb2N0eXBlIGh0bWwgcHVibGljICItLy9XM0MvL0RURCBIVE1MIDQuMCBUcmFuc2l0aW9u YWwvL0VOIj4NCg0KDQo8aHRtbD4NCjxoZWFkPg0KPG1ldGEgaHR0cC1lcXVpdj0iQ29udGVu and so on. And, as I've said, the message in the viewing pane is unadulterated plain text. Can you tell me where it is all going wrong? Thanks.
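    As a quick sanity check (a sketch, assuming a Unix-like shell with the coreutils base64 tool is available), the encoded payload quoted above can be decoded by hand; if it comes out as well-formed HTML, the sender's part is intact and the problem is on the rendering side:

        # Decode the first base64 line from the message source;
        # the output should begin with an HTML doctype declaration.
        echo 'PCFkb2N0eXBlIGh0bWwgcHVibGljICItLy9XM0MvL0RURCBIVE1MIDQuMCBUcmFuc2l0aW9u' | base64 -d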

    Read the article

  • API Class with intensive network requests

    - by Marco Acierno
    I'm working an API which works as "intermediary" between a REST API and the developer. In this way, when the programmer do something like this: User user = client.getUser(nickname); it will execute a network request to download from the service the data about the user and then the programmer can use the data by doing things like user.getLocation(); user.getDisplayName(); and so on. Now there are some methods like getFollowers() which execute another network request and i could do it in two ways: Download all the data in the getUser method (and not only the most important) but in this way the request time could be very long since it should execute the request to various urls Download the data when the user calls the method, it looks like the best way and to improve it i could cache the result so the next call to getFollowers returns immediately with the data already download instead of execute again the request. What is the best way? And i should let methods like getUser and getFollowers stop the code execution until the data is ready or i should implement a callback so when the data is ready the callback gets fired? (this looks like Javascript)
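    One way to combine the two approaches is lazy loading with a cache: getUser fetches only the core profile, and getFollowers fetches on its first call and remembers the result. A minimal sketch in the question's own Java-like style (RestClient, fetchFollowers and the async variant are assumptions for illustration, not the real API):

        import java.util.List;

        // Assumed low-level HTTP wrapper; only here to make the sketch self-contained.
        interface RestClient {
            List<User> fetchFollowers(String nickname);
        }

        public class User {
            private final RestClient client;
            private final String nickname;
            private List<User> followers;   // null until first requested

            public User(RestClient client, String nickname) {
                this.client = client;
                this.nickname = nickname;
            }

            // Blocking, lazily loaded and cached: only the first call hits the network.
            public synchronized List<User> getFollowers() {
                if (followers == null) {
                    followers = client.fetchFollowers(nickname);   // network request happens here
                }
                return followers;
            }

            // Non-blocking alternative: run the fetch on a worker thread and hand the result to a callback.
            public void getFollowersAsync(java.util.function.Consumer<List<User>> callback) {
                new Thread(() -> callback.accept(getFollowers())).start();
            }
        }

    Offering both a blocking getter and a callback variant lets the caller decide; the cache makes repeated calls cheap either way.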

    Read the article

  • Changing LDAP schema renders Confluence AD integration inoperable

    - by Maxim V. Pavlov
    I had our instance of Atlassian Confluence configured to integrate with our Active Directory. In AD, all the users were created under the default Users folder in Active Directory Users and Computers. We decided to introduce cleaner separation and created an Organizational Unit structure in AD: under the root we created a Managed OU, under it a Users OU, and all user accounts were moved into that Users OU. I thought that to let the Confluence AD integration engine "know" where to look for user accounts now, I only needed to adjust the BaseDN and prepend ou=Managed, so that it looks for cn=Users but under ou=Managed. That didn't work. How should I adjust the LDAP search base in a client application so it can find users that live in an OU rather than in the default folder?
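    For reference, a hedged sketch of the distinction that usually trips this up (the domain components are placeholders): the default Users folder is a container (CN), not an OU, so the two locations differ by more than just the ou=Managed prefix:

        Old location (default container):    CN=Users,DC=example,DC=com
        New location (organizational units): OU=Users,OU=Managed,DC=example,DC=com

    In other words, a base DN that still says cn=Users underneath ou=Managed points at a path that no longer exists; the new base DN has to name the OUs themselves.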

    Read the article

  • Weird Windows 2003 MSDTC and SQL 2005 issue

    - by seagull surfer
    Scenario: Windows 2003 SP2 x64 Enterprise Edition, SQL 2005 SP2 CU9 x64 Enterprise Edition. After restarting the resource groups on a two-node active-active cluster, three SQL 2005 instances start up fine. The fourth one starts up but begins throwing the following error: "Enlist operation failed: 0x8004d00e (XACT_E_NOTRANSACTION). SQL Server could not register with Microsoft Distributed Transaction Coordinator (MS DTC) as a resource manager for this transaction. The transaction may have been stopped by the client or the resource manager." MSDTC is fine, since the other three function normally. The only way to "fix" it is to take the fourth instance offline and bring it online again. Is there any way to fix this enlistment without restarting?

    Read the article

  • How to configure URLs for one server using a wildcard certificate?

    - by Amit
    We have a wildcard certificate installed in our production environment. One of our clients wants his name to appear in the URL (e.g. companyname.sitename.net). How should we facilitate this? Do we need to make any entries for this in DNS? If yes, can you please let me know about them? I need to set this up before Friday PST; any help with this is highly appreciated. Thanks.
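    Assuming sitename.net is the zone you control and the wildcard certificate covers *.sitename.net, a record along these lines (a sketch with placeholder values) would point the client-branded name at the existing server:

        ; in the sitename.net zone file
        companyname    IN    CNAME    www.sitename.net.

    The web server also has to be configured to answer for that host name (a virtual host or site binding matching companyname.sitename.net), otherwise it will serve whatever its default site is.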

    Read the article

  • Remote Desktop via VPN from Mac to Windows Vista

    - by Vegar
    I have some problems connecting to my office from home. I'm getting the following error message: Remote Desktop Connection cannot verify the identity of the computer that you want to connect to. Try reconnecting to the Windows-based computer, or contact your administrator. I have downloaded CoRD, and for some reason, that works okay. I can also connect from a Windows 7 machine running on VMware Fusion. On Windows 7, I use the SonicWall Global VPN Client, and on the Mac, I use VPN Tracker, if that is related... What's going on?

    Read the article

  • Validate that a checkbox is checked using JavaScript

    - by H(at)Ni
    I was facing a challenge yesterday while creating a Visual Web Part: I wanted to validate that the submit button only works if the user has checked an "I agree to terms" checkbox. The weird thing was that I tested my code on a normal ASP.NET website and it worked perfectly, while it behaved differently inside the web part: whenever I check the checkbox, the button is enabled, but it does not fire the ASP.NET validators on the client side. It posts back the page and the validators only appear after that. So I changed my approach and reached a different solution: call a JavaScript function whenever the button is clicked and check there whether the checkbox is ticked. To illustrate, here is an example of what I mean:
    1. Button in the aspx page:
    <asp:Button OnClientClick="CheckForCondition();" ValidationGroup="CompaniesSection" ID="btnCompaniesSubmit" runat="server" Text="Submit" />
    2. CheckForCondition() function:
    <script language="javascript" type="text/javascript">
        function CheckForCondition() {
            if ($jq('#<%= ChkCompanyCheck.ClientID %>:checked').val() == undefined) {
                $jq('#lblCheckBox').show();
                return false;
            }
            else {
                $jq('#lblCheckBox').hide();
                return true;
            }
        }
    </script>
    3. lblCheckBox is simply a label that shows a red asterisk beside the checkbox to indicate that it's a required field:
    <label id="lblCheckBox" style="color:Red;display:none">*</label>

    Read the article

  • Apache DVB HTTP video streaming bandwidth or priority problem

    - by igino manfre'
    I'm streaming a few precompressed DVB videos from the cloud. The streams are generated by VLC on "impossible" ports (such as 64085, 64086, etc.) and reverse proxied by Apache on ports 80 and 8080. All the generated streams are listed at "http://95.110.164.61/indexv.html". From an ADSL connection with enough downlink bandwidth, requesting the stream generated by VLC directly (such as "http://95.110.164.61:64087/mpg2_6.4") plays fluently. Requesting the same stream proxied by Apache ("http://95.110.164.61/mpg2_6.4"), the stream stops and goes. The only situation in which the Apache-proxied streams flow regularly is from a site connected through 64 Mbps of guaranteed bandwidth with an RTT to the server below 10 ms. Please note that streams below 2 Mbps are proxied fluently. The system is a single-core Xeon running Windows 2008 R2 with 4 GB of RAM and 1 Gbps of network bandwidth. The drain on computational and bandwidth resources is negligible, and RAM usage is always below 50%. On the system I run many VLC streamers. Each of them uses a variable amount of RAM (from about 25 to 70 MB). By contrast, the couple of httpd.exe processes use no more than 7 MB. Using Wireshark (on the server) I see that VLC directly sends the client many more packets than Apache does, and the stream is fragmented into many frames. I'm not a programmer and am new to Apache. Can anyone please point me to a specific portion of Apache's huge documentation? Thank you. igino
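    For context, a minimal sketch of the kind of mod_proxy stanza assumed here (paths and ports taken from the question; buffering-related tuning is left out since it varies by Apache version), requiring mod_proxy and mod_proxy_http to be loaded:

        # httpd.conf excerpt - forward one stream URL to the local VLC instance
        ProxyRequests Off
        ProxyPass        /mpg2_6.4  http://127.0.0.1:64087/mpg2_6.4
        ProxyPassReverse /mpg2_6.4  http://127.0.0.1:64087/mpg2_6.4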

    Read the article

  • Do I have to chmod 777 my NFS folder when I share?

    - by luckytaxi
    Under Red Hat, if I export a folder as an NFS mount, does the folder have to be readable and writable for user, group, and others? Right now /storage/software is -rwxr-xr-x root/root, i.e. in /etc/exports: /storage/software *(rw,sync) On my client, I can mount but I can't write. I'm using a regular user and NOT root. I think "no_root_squash" fixes it but I really don't want that. Then again, nor do I want to have to chmod 777 the folder on the server.
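    One middle ground (a sketch, assuming the writing users can share a group; "nfsusers" is a hypothetical name) is to keep the export root-owned but grant group write instead of opening it to everyone:

        # on the NFS server
        groupadd nfsusers                 # hypothetical group
        chgrp nfsusers /storage/software
        chmod 2775 /storage/software      # rwxrwsr-x: group-writable, setgid so new files keep the group

        # /etc/exports stays as before
        /storage/software *(rw,sync)

    The client user then needs to belong to a group with the same numeric gid, since classic NFS maps users and groups purely by their numeric IDs.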

    Read the article

  • How to mount a remote samba share from the local host with multiple groups?

    - by Dragos
    I am using mount.cifs to mount a remote samba share (both client and server are Ubuntu Server 8.04) like this: mount.cifs //sambaserver/samba /mountpath -o credentials=/path/.credentials,uid=someuser,gid=1000 $ cat .credentials username=user password=password I mounted it as a local user with username and password via mount.cifs, but the problem is that the user is part of multiple groups on the remote system, and with mount.cifs I can only specify one gid. Is there a way to specify all the gids that the remote user has? Is there a way to: 1) Mount the remote samba share with multiple groups on the local system? 2) Browse the mount from 1) in the terminal, since I want to pass some files from samba as arguments to local programs. Other solutions would be: nautilus sftp://, which runs through gvfs, but newer GNOME no longer writes ~/.gvfs to disk so I can't browse it in the terminal. And the last solution would be NFS, but that means I have to synchronize the uids and gids on the local system with the ones from the server.

    Read the article

  • Sending large files - do any vendors sell their solution?

    - by Rob Nicholson
    We currently have an account with www.mailbigfile.com to allow us to send & receive files which exceed our client's email limits. In our industry, a 10MB limit is not unknown. Mailbigfile works fine for what it is but increasingly, our clients are starting to block it as a security risk. A solution would be for us to license the software and run it from our own web server which is far less likely to be blocked. Does anyone know of vendors in this market? We are looking at web collaboration systems but that's a much bigger project. The technology behind www.mailbigfile.com isn't that complex (http upload, email notification and then http download) so I'm hoping it won't be very expensive. Cheers, Rob.

    Read the article

  • How to block users in Apache httpd from accessing *.php files directly, so content is only reachable via the directory name

    - by Oxi
    My requirement looks simple, but Googling did not help me yet. My query is: I want to throw a 404 page at a user (not redirect to another folder or file) who is trying to access *.php files on my website. For example, when a client asks for www.example.com/home/ I want to show the content, but when the user requests www.example.com/home/index.php I want to show a 404 page. I tried different methods and nothing worked for me; one of the attempts is shown below:
    <Directory "C:/xampp/htdocs/*">
        <FilesMatch "^\.php">
            Order Deny,Allow
            Deny from all
            ErrorDocument 403 /test/404/
            ErrorDocument 404 /test/404/
        </FilesMatch>
    </Directory>
    Thanks in advance
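    Two things worth noting about the attempt above: "^\.php" matches names that begin with ".php" rather than end with it, so the rule never fires, and a FilesMatch deny may also catch the internal index.php subrequest behind /home/. A hedged mod_rewrite sketch that keys on the literal request line instead (R=404 needs Apache 2.4; on 2.2 use [F] for a 403):

        RewriteEngine On
        # THE_REQUEST holds the raw request line, so internal DirectoryIndex
        # lookups of index.php for /home/ are unaffected
        RewriteCond %{THE_REQUEST} \.php [NC]
        RewriteRule .* - [R=404,L]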

    Read the article

  • Is it necessary to add an HP printer to CUPS using the hplip URI?

    - by JPbuntu
    I recently set up a CUPS print server (Ubuntu Server 12.04) and I am having trouble with the performance of an HP Color LaserJet CP3505n printer. The printer pauses for about a second between printing each page, which is annoying when there is a lot of printing to be done. This doesn't happen when the printer is installed directly on a Windows client. In an attempt to fix this I have set up the printer a couple of different ways. I decided not to use a Samba share since this wiki said IPP is preferred. First method: added the HP LaserJet to CUPS as a Discovered Network Printer, and selected the HP Color LaserJet cp3505 hpijs pcl3, 3.12.2 (en) driver. I did not use an hplip URI. Second method (hplip URI): I thought adding hplip to the mix might improve performance, so I added the printer like this: ran hp-setup -m 192.168.2.60 and was prompted to select a driver; selected HP Color LaserJet cp3505 hpijs pcl3, 3.12.2 (en); used hplip to generate a URI: hp-makeuri 192.168.2.60; then added the printer to CUPS as a Local Printer: HP Printer (HPLIP), and entered: hp:/net/HP_Color_LaserJet_CP3505?ip=192.168.2.60. With either method I am able to share the printer on the network by adding a printer as http://192.168.2.2:631/printers/HP_LASER-TERRAC. Does it make a difference which way the printer is added to CUPS? If so, and I install the printer with the hp URI, can I still change the driver using the CUPS web interface? I have been trying out different drivers to try to improve performance, and the CUPS interface is the easiest way to change them. Thanks in advance.
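    Either way the result is an ordinary CUPS destination, so the driver can still be changed afterwards (via the web interface or lpadmin) without touching the device URI. A hedged command-line sketch, with the driver URI left as a placeholder to be taken from lpinfo -m:

        # List candidate drivers, then point the existing queue at the hplip URI
        lpinfo -m | grep -i cp3505
        lpadmin -p HP_LASER-TERRAC -E \
                -v "hp:/net/HP_Color_LaserJet_CP3505?ip=192.168.2.60" \
                -m <driver-uri-from-lpinfo-m>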

    Read the article

  • How to tunnel port 25565 through SSH?

    - by user62389
    I want to play a game which is hosted on port 25565 (Minecraft!), but my university firewall does not allow this port. I have a dedicated server running Linux not too far from uni, so I think there's a way to tunnel through it (though I've never done this before and have no knowledge or experience of tunnelling). It would probably be slow, but it's better than not being able to play at all. Is it possible to do using only SSH, or do I need other client/server software? My server has OpenSSH installed. Also, the computer I'm using to play the game is running Ubuntu. I've tried searching, but there seem to be so many different solutions to different types of problem =/
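    A plain OpenSSH local port forward should be enough here, with no extra software on either end; a sketch (the hostnames are placeholders):

        # Forward local port 25565 through the dedicated server to the game host;
        # the university firewall only ever sees the outbound SSH connection.
        ssh -N -L 25565:minecraft.example.com:25565 you@your-server.example.org
        # Then point the Minecraft client at localhost:25565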

    Read the article

  • My D-Link Router is only allowing one connection

    - by Blaze
    My router (Model: DI-624) is only allowing one wireless connection, to one laptop. The other laptop hangs at connecting to the Internet. I have the SSID set as "Pedro-Home" and am using a WPA-PSK secured password. I set up the router using "Blaze-PC" while wired. Both laptops critically need the Internet.
    Dynamic DHCP Client List:
    Host Name    IP Address       MAC Address          Expired Time
    Blaze-PC     192.168.0.100    70-f1-a1-ff-39-a8    Apr/21/2011 17:49:14
    pedro        192.168.0.105    00-26-82-c8-47-25    Apr/21/2011 17:50:05   <- this computer isn't connecting

    Read the article

  • What are the hard and fast rules for Cache Control?

    - by Metalshark
    Confession: sites I maintain have different rules for Cache-Control, mostly based on the default configuration of the server followed up with recommendations from the Page Speed and YSlow Firefox plug-ins and the Network Resources view in Google's Speed Tracer. Cache-Control is set to private/public depending on what they say to do, ETag/Last-Modified headers are only tinkered with if YSlow suggests there is something wrong, and Vary: Accept-Encoding seems necessary when manually gzipping files for Amazon CloudFront. When reading through the material on the different options and what they do, there seems to be conflicting information, rules for broken proxies, and cargo-cult configurations. The official information provided by the analysis tools mentioned above is quite inaccessible, as it deals with each topic individually instead of as a unified strategy (so there is no cross-referencing of techniques). For example, it seems to make no sense that the speed analysis tools rate a site with ETags the same as a site without them if they are meant to help with caching. What are the hard and fast rules for a platform-agnostic Cache-Control strategy? EDIT: A link through Jeff Atwood's article explains caching in superb depth. For the record, though, here are the hard and fast rules:
    - If the file is compressed using gzip, etc., use "Cache-Control: private", as a proxy may return the compressed version to a client that does not support it (the browser cache will hold files marked this way, though). Also remember to include "Vary: Accept-Encoding" to say that it is compressible.
    - Use Last-Modified in conjunction with ETag - belt-and-braces usage provides both validators, and since ETag is based on file contents instead of modification time alone, using both covers all bases. NOTE: AOL's PageTest has a carte blanche approach against ETags for some reason.
    - If you are using Apache on more than one server to host the same content, remove the implicitly declared inode from ETags by excluding it from the FileETag directive (i.e. "FileETag MTime Size") unless you are genuinely using the same live filesystem.
    - Use "Cache-Control: public" wherever you can - this means that proxy servers (and the browser cache) will return your content even if the rest of the page needs HTTP authentication, etc.
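    As a concrete reference point, a hedged Apache sketch of the rules listed above (mod_headers is assumed to be enabled; the file patterns and max-age values are placeholders):

        # Drop the inode from ETags when several servers serve the same content
        FileETag MTime Size

        # Static, uncompressed assets: shareable by proxies
        <FilesMatch "\.(css|js|png|jpg|gif)$">
            Header set Cache-Control "public, max-age=604800"
        </FilesMatch>

        # Pre-gzipped files: keep shared proxies out of it and flag the encoding
        <FilesMatch "\.gz$">
            Header set Cache-Control "private, max-age=604800"
            Header append Vary "Accept-Encoding"
        </FilesMatch>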

    Read the article

  • Home Media Streaming Server

    - by sharjeel
    I'm looking for a Home Media Server which can stream videos to various devices including iPhone, iPad, Windows Machines, Android etc. I wouldn't want to install any client on any device; a web interface will be preferred. Also, it should be able to read videos of as many formats as possible and should stream by transcoding the resolution and format according to device's specifications. I have a fairly powerful machine on which I'm free to install Windows 7 or Linux. The streaming will be done over a g/n WiFi router or Ethernet locally. What is the suggested solution in my case?

    Read the article

  • Recovering mail from hard drive (broken OS)

    - by Maciek Sawicki
    A friend of mine broke his Windows install (it doesn't boot). I will probably have to reinstall the OS. I want to mount his hard drive in my OS to back up some important files. He said he uses Outlook Express, but he runs Vista, so I'm not sure whether it's actually Vista's Windows Mail client or Outlook. What files do I have to copy to back up the mail and address book from Outlook, Outlook Express, and Windows Mail? Is there any good program for doing it automatically (for Linux or Windows)? If there are tools or special procedures for other mail clients (for example Thunderbird), you can also name them to make this question more generic.

    Read the article

  • Imap server woes with Android Gingerbread email and Thunderbird

    - by Mojo
    I run my own mail server and use UW's imapd/popd daemons to provide service. This week I just upgraded my OG Droid to a new Droid 3, running Android 2.3.4 (Gingerbread). The email client is much improved over the previous one. But now I have a bad interaction when I try to access email using imap from Thunderbird on a laptop or desktop. Frequently Thunderbird will stop receiving any email at all, and it will appear only on the Droid. Sometimes a Thunderbird restart will make the mail appear, but none of my "deletes" will be recorded, so when I start Thunderbird again, all my old email reappears. If I kill all of the open imap daemons and restart xinetd, I can force it to behave for maybe a session. I've tried turning off IDLE service (push email) on both sides, to no apparent avail. I've also tried installing DroidMail with the same result.

    Read the article

  • How to be hired as a remote programmer abroad without being an entrepreneur?

    - by user592704
    The question is quite interesting to me because I have been watching job ads, and mostly they all: A) require being located in the same country as the vacancy; B) come from employers who don't want to hire foreign programmers unless they have an H1B visa or similar; C) as a rule, offer 6-month contract positions. I could keep extending this list of job-ad specifics, but in any case most positions require non-employee cooperation status. I don't have a company for that kind of "projects made to a client's order" work, so it is quite complicated. So, just for the sake of statistics, I was trying to find out whether there is a way to be hired abroad as a remote programmer, as if I were being hired in my native city. The point is not about being hired wherever I happen to be located, but about the possibility of working without relocating, which today's technologies should make realistic, especially for IT specialists in many different fields. So the question is: is it possible to work for a company in another country remotely, as if I were working locally? What do I need for that? Can you recommend some useful websites in this direction? If you can share your own experience, I'd love to hear it. Any useful advice is much appreciated.

    Read the article

  • How to back up data on a Debian VPS to Dropbox?

    - by IBr
    I have a really simple private VPS with some web pages and a music server. I want to back up some configs and some scripts to Dropbox or a similar service. The server has no GUI (except simple SSH X forwarding, which is neither convenient for constant use nor provides a full desktop); everything is controlled through SSH. So my question is: is it possible to set up a Dropbox client for command-line use? How? Are there any alternatives to Dropbox which have command-line clients? Also, is it possible to incorporate the backup into a script for a cron job?
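    One common pattern, sketched under the assumption that a command-line uploader such as the third-party Dropbox-Uploader script (dropbox_uploader.sh) is installed and already authorised against the account; any similar tool slots in the same way:

        #!/bin/sh
        # /usr/local/bin/backup-to-dropbox.sh - archive configs/scripts and push them up
        STAMP=$(date +%Y%m%d)
        tar czf /tmp/vps-backup-$STAMP.tar.gz /etc/nginx /home/me/scripts   # example paths
        /usr/local/bin/dropbox_uploader.sh upload /tmp/vps-backup-$STAMP.tar.gz backups/
        rm -f /tmp/vps-backup-$STAMP.tar.gz

        # crontab entry (crontab -e) to run it nightly at 03:00:
        # 0 3 * * * /usr/local/bin/backup-to-dropbox.sh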

    Read the article

  • Maximum MTU size

    - by user192702
    I thought one of the issues I'm experiencing in the following question was, rightly, due to MTU: ESXi 5 VM Putty session hangs, vSphere client timing out. However, when I tried testing the maximum MTU size, there seems to be no limit at all, even though I thought Ethernet only allows a limited MTU. I'm up to 54450:
    ping -l 54450 192.168.10.7
    Pinging 192.168.50.7 with 54450 bytes of data:
    Reply from 192.168.10.7: bytes=54450 time=1081ms TTL=62
    Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62
    Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62
    Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62
    Ping statistics for 192.168.10.7:
        Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
    Approximate round trip times in milli-seconds:
        Minimum = 1079ms, Maximum = 1081ms, Average = 1079ms
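    The catch is that plain ping happily fragments large payloads, so the test above never exercises the MTU at all; on Windows the Don't Fragment flag has to be set to probe it. A sketch of the usual test:

        rem 1472 bytes of ICMP payload + 28 bytes of headers = a full 1500-byte frame
        ping -f -l 1472 192.168.10.7
        rem If this succeeds but -l 1473 reports "Packet needs to be fragmented but DF set",
        rem the path MTU is the standard Ethernet 1500.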

    Read the article

  • Microsoft CA certificates expire sooner than the template specifies

    - by Tim Brigham
    The certificates my Microsoft CA is generating do not match the time period indicated in the template used. How can I resolve this? I recently created a new certificate template for use on my Linux boxes on my Microsoft CA (2008 R2 Enterprise). This template is approved for server and client authentication purposes with a validity period of 10 years - the expected lifetime of our Linux boxes - and the subject name supplied in the request. I have checked both the intermediate and offline CA - both have more than 10 years of life listed. Is there some kind of hard limit I'm hitting here?
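    One hard limit worth checking (a hedged sketch of the usual commands): the issuing CA caps every certificate at its registry ValidityPeriod, commonly two years by default, regardless of what the template asks for:

        :: Inspect the current cap on the issuing CA
        certutil -getreg ca\ValidityPeriod
        certutil -getreg ca\ValidityPeriodUnits
        :: Raise it to 10 years and restart the CA service so it takes effect
        certutil -setreg ca\ValidityPeriod "Years"
        certutil -setreg ca\ValidityPeriodUnits 10
        net stop certsvc && net start certsvc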

    Read the article

  • How to mount remote samba share from local host with multiple groups?

    - by Dragos
    I am using mount.cifs to mount a remote samba share (both client and server are Ubuntu server 8.04) like this: mount.cifs //sambaserver/samba /mountpath -o credentials=/path/.credentials,uid=someuser,gid=1000 $ cat .credentials username=user password=password I mounted a user from local system with username and password with mount.cifs but the problem is that the user is part of multiple groups on the remote system and with mount.cifs I can only specify one gid. Is there a way to specify all the gids that the remote user has? Is there a way to: Mount the remote samba with multiple groups on the local system? Browse the mount from 1) with the terminal since I want to pass some files from samba as arguments to local programs. Other solutions would be: nautilus sftp:// which runs through gvfs; but the newer gnome does not write to disk the ~/.gvfs anymore so I can't browse it in terminal. And the last solution would be NFS but that means that I have to synchronize the uids and gids on the local system with the ones from the server.

    Read the article

  • Invitation: Oracle Endeca Information Discovery Boot Camp

    - by mseika
    The Oracle Endeca Information Discovery (OEID) Boot Camp is designed to give partners an understanding of OEID's features and how it complements the existing Oracle Business Intelligence suite. Participants will learn how to develop and implement solutions using a data discovery method. Training is in English.
    What will be covered? The OEID Boot Camp is a three-day class combining lectures and hands-on exercises, tailored to make participants aware of the Oracle Endeca Information Discovery platform and to build valuable skills for implementing projects. Participants apply what they have learned by extracting from sample data sources and creating an end-user application used to answer several business questions.
    What you will learn:
    - Architecture: OEID components, use of graphs, overview of clustering
    - OEID installation: architecture planning, infrastructure requirements, installation process, production hints & tips
    - OEID administration: data store management, administrative operations, portal configuration, data sources, system monitoring
    - Indexing: Integration Suite, data source analysis, graph (ETL) creation, record design techniques
    - Portlets: Studio portlets, custom portlet development, querying functions
    - Reporting: Studio applications & best practices, visualizations, EQL
    Prerequisites: you must bring a laptop for the hands-on labs. Participants will perform the lab exercises using a virtual machine image provided within a cloud environment, so the laptop must be able to reach a Windows server via Microsoft RDP. No software needs to be installed beyond an RDP client suitable for your OS. Laptop requirements - hardware: dual-core x64 CPU, 1.8 GHz or higher, 2 GB RAM; software: Microsoft Remote Desktop Client and Internet Explorer 7, Firefox, or Google Chrome.
    This boot camp is intended for prospective implementers of Oracle Endeca Information Discovery (OEID), or those in a presales role looking to gain insight into the technical benefits of this new package. Attendees should have experience and familiarity with the basic concepts of business intelligence.
    Where and when? Monday, October 15th until Wednesday, October 17th inclusive, 9:00 - 18:00, Oracle France, 15, boulevard Charles de Gaulle, 92715 Colombes. Register here. Limited number of seats!

    Read the article
