Search Results

Search found 30217 results on 1209 pages for 'website performance'.

Page 161/1209 | < Previous Page | 157 158 159 160 161 162 163 164 165 166 167 168  | Next Page >

  • iPhone SSL Website Certificate Warning

    - by Sonny
    I have a few sites that have SSL certificates installed. When an SSL request is made with my employer's iPhone, this error message is displayed: "Accept Website Certificate: The certificate for this website is invalid. Tap Accept to connect to this website anyway." I've pulled up the same pages in other browsers, including Safari, and they do not show any issues with the certs. These two URLs exhibit the problem: https://www.powerlunchbunch.com/index.php?template=join&nav=20 https://www.councilonagingmartin.org/index.php?template=donate&nav=257 Additional information: both SSL certs are issued by Network Solutions.
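
    A frequent cause of this exact symptom (desktop browsers fine, iPhone complaining) is a server that sends its leaf certificate without the intermediate CA certificate: desktop browsers often repair the chain from their cache, while iOS does not. A minimal diagnostic sketch in Python (not from the post) that attempts a strictly verified TLS handshake against the system trust store:

        import socket
        import ssl

        # A fully verified handshake: a missing intermediate certificate
        # will surface here as a verification error even when desktop
        # browsers display the page without complaint.
        def check_chain(host, port=443):
            context = ssl.create_default_context()
            try:
                with socket.create_connection((host, port), timeout=10) as sock:
                    with context.wrap_socket(sock, server_hostname=host) as tls:
                        print(host, "OK:", tls.getpeercert()["subject"])
            except ssl.SSLError as exc:
                print(host, "verification failed:", exc)

        for host in ("www.powerlunchbunch.com", "www.councilonagingmartin.org"):
            check_chain(host)

    If this fails while desktop browsers succeed, installing the issuer's intermediate certificate on the server is the usual fix.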

    Read the article

  • How to use Google Translate to translate a website automatically using GeoIP

    - by AK
    I have been looking around the internet for a script that would use the Google Translate API to translate a website automatically through a GeoIP script, without the need to click a translate button. Google does provide a small div snippet which you can add to your website; through a drop-down menu you choose the language, click translate, and it translates the whole website. The snippet is here: http://translate.google.com/translate_tools?hl=en&layout=1&eotf=1&sl=ru&tl=en How can I integrate a GeoIP script with the above snippet? There are also a couple of Google Translate scripts available on the internet.
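
    One hedged way to wire these together is to resolve the visitor's country on the server and feed the resulting language code into the snippet's target-language (tl) parameter, so no button click is needed. A Python sketch of the server-side half; lookup_country() is a placeholder, not a real library call:

        # Map the visitor's country (from any GeoIP database) to the
        # language code the translate snippet should be rendered with.
        COUNTRY_TO_LANG = {"FR": "fr", "DE": "de", "ES": "es", "RU": "ru"}

        def lookup_country(ip):
            # Placeholder: swap in a real GeoIP lookup. Returns a
            # fixed value here only so the sketch runs.
            return "RU"

        def target_language(ip, default="en"):
            return COUNTRY_TO_LANG.get(lookup_country(ip), default)

        print(target_language("203.0.113.9"))  # -> ru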

    Read the article

  • How to force two processes to run on the same CPU?

    - by kovan
    Context: I'm programming a software system that consists of multiple processes. It is programmed in C++ under Linux, and the processes communicate among themselves using Linux shared memory. Usually in software development, performance optimization is done in the final stage, and here I ran into a big problem. The software has high performance requirements, but on machines with 4 or 8 CPU cores (usually with more than one CPU), it was only able to use 3 cores, thus wasting 25% of the CPU power on the former and more than 60% on the latter. After much research, and having discarded mutex and lock contention, I found out that the time was being wasted on shmdt/shmat calls (detaching from and attaching to shared memory segments). After some more research, I found out that these CPUs, which are usually AMD Opteron and Intel Xeon, use a memory architecture called NUMA, which basically means that each processor has its own fast "local memory", and accessing memory belonging to other CPUs is expensive. After doing some tests, the problem seems to be that the software is designed so that, basically, any process can pass shared memory segments to any other process, and to any thread in them. This seems to kill performance, as processes are constantly accessing memory belonging to other processes. Question: Is there any way to force pairs of processes to execute on the same CPU? I don't mean forcing them to always execute on the same fixed processor; I don't care which one they run on, although that would do the job. Ideally, there would be a way to tell the kernel: if you schedule this process on one processor, you must also schedule its "brother" process (the process with which it communicates through shared memory) on that same processor, so that performance is not penalized.
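
    On Linux this is usually done with CPU affinity: sched_setaffinity(2) in C/C++ (the taskset utility wraps the same call), with numactl/libnuma available when memory placement also matters. A minimal sketch using Python's Linux-only os.sched_setaffinity wrapper; the core number is arbitrary:

        import os

        def pin_to_cpu(pid, cpu):
            # Restrict a process (0 = the calling process) to one core;
            # a thin wrapper over the Linux sched_setaffinity(2) syscall.
            os.sched_setaffinity(pid, {cpu})

        pin_to_cpu(0, 2)          # pin the parent to CPU 2 first...
        child = os.fork()         # ...the child inherits that affinity,
        if child == 0:            # so both "brother" processes now share
            os._exit(0)           # CPU 2 and its local NUMA memory
        os.waitpid(child, 0)

    This pins the pair to a fixed core, which the post concedes would already do the job; a "schedule these two together, anywhere" policy would need cpusets or a small coordinator that re-pins pairs at runtime.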

    Read the article

  • Developing mobile apps (iPhone, BlackBerry, Android) for an existing website

    - by soodos
    Hey guys, I have a website and am planning to develop a mobile version of it for iPhone, BlackBerry and Android. My website is a social network built on the PHP Zend Framework. All of these mobile apps are going to have the same functionality as the website. I am a little ignorant about this, but at a high level I understand that the mobile apps should not contain any backend logic: for every piece of functionality, they simply make a web service API call to interact with the backend. So does that mean that for every piece of functionality I need to create a web service method? Can the existing code be reused? I'm a little lost; can someone shed some light on this matter or point me in the right direction (like some articles)? Thanks
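
    That is the usual shape: keep the existing models/service layer and add thin endpoints that serialize their results as JSON for all three clients. A language-agnostic sketch of the pattern in Python (the real site would do this in its Zend controllers; all names here are illustrative):

        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        def list_friends(user_id):
            # Stand-in for existing backend logic (e.g. a Zend model);
            # the point is that it is reused, not rewritten per app.
            return [{"id": 7, "name": "alice"}, {"id": 9, "name": "bob"}]

        class ApiHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                if self.path.startswith("/api/friends"):
                    body = json.dumps(list_friends(user_id=1)).encode()
                    self.send_response(200)
                    self.send_header("Content-Type", "application/json")
                    self.end_headers()
                    self.wfile.write(body)
                else:
                    self.send_error(404)

        HTTPServer(("", 8080), ApiHandler).serve_forever()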

    Read the article

  • Issues with absolute paths for JS files in a website

    - by Vinni
    Hello guys, I have a website which has many subfolders in it. I have the following path references to my JS and CSS files: <link rel="stylesheet" type="text/css" href="css/styles.css" /> <script type="text/javascript" src="js/jquery.js"></script> The above code works fine on my local machine, but the JS file does not load when I host the website on the production server. The problem on my hosting server is that my website is pointed to www.somewebsitename.com instead of www.somewebsitename.com/home.aspx. When I load the page via www.somewebsitename.com/home.aspx, all the JS files load; they fail to load only when I load the page via www.somewebsitename.com. How should I reference the JS files so that they load however the page is visited?
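
    A hedged note on the usual fix: the references above are relative, so they resolve against whatever URL served the page. Making them root-relative, i.e. href="/css/styles.css" and src="/js/jquery.js", makes the browser resolve them from the site root regardless of which URL form was used; since the site serves .aspx pages, src="<%= ResolveUrl("~/js/jquery.js") %>" is an equivalent ASP.NET-side option.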

    Read the article

  • Implementing website messaging services

    - by Alex
    There's a website which currently has only web interfaces for all the users. However, a new requirement has emerged: we need to provide an alternative interface for a certain group of website users who want to regularly receive pushed notifications (messages) informing them about new events (news) in their field of interest. What sort of interface would that be? Is it a matter of using a MOM (message-oriented middleware) like JMS, deployed within an application server, with a publish/subscribe model? Is that the correct way to meet this sort of requirement? And if it's not a web-based interface, what kind of interface would it be for the users? Thanks
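
    For intuition, a toy sketch of the publish/subscribe model the question mentions, in Python (an in-process illustration of the shape, not a JMS replacement): publishers post to topics, and each subscriber drains its own queue, which a web tier could expose via long-polling, e-mail digests, or a feed.

        from collections import defaultdict
        from queue import Queue

        class Broker:
            def __init__(self):
                self._topics = defaultdict(list)   # topic -> subscriber queues

            def subscribe(self, topic):
                q = Queue()
                self._topics[topic].append(q)
                return q

            def publish(self, topic, message):
                for q in self._topics[topic]:
                    q.put(message)

        broker = Broker()
        inbox = broker.subscribe("news/physics")
        broker.publish("news/physics", "New event in your field of interest")
        print(inbox.get())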

    Read the article

  • Is selling a "website screen scraper" illegal?

    - by Yatendra Goel
    I have coded a "website screen scraper" and want to sell it commercially. I know that the webmaster of the target website has restricted its webpages from being scraped: the robots.txt file of the website says that its webpages must not be scraped. So my question is whether selling that screen scraper is a crime, or whether using that screen scraper is a crime, in legal terms. I know that this question is related to law, but I thought the software experts on SO must also have an answer to this question.

    Read the article

  • Twitter - Get users' live tweets on my website instantly

    - by Gublooo
    Hello guys, I was checking out the stocktwits.com website. While signing up, I provided them with my Twitter username. Now, whenever I tweet and my tweet contains $ and a stock ticker symbol, it instantly appears on Stocktwits.com. I am interested in implementing something similar on my website, and I just want an understanding of how this would work. This is how I'm assuming it would work at a high level: 1) On my website, require users to provide their Twitter usernames. 2) Run a cron job every few minutes to pull the latest tweets for each of the usernames provided, using something like http://twitter.com/statuses/user_timeline/username.xml I've tried StockTwits and their updates are almost instantaneous, so it does not appear that they are checking for updates every few minutes. What are the best ways to implement this solution? Thanks
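
    The near-instant behavior usually points to a push mechanism (at the time, Twitter's Streaming API held an HTTP connection open and pushed matching statuses) rather than polling. For completeness, a hedged sketch of the polling approach from step 2, using the XML endpoint the post cites (long since retired), with an id high-water mark so tweets aren't re-processed:

        import time
        import urllib.request
        import xml.etree.ElementTree as ET

        last_seen = {}   # username -> newest status id already handled

        def poll(username):
            url = "http://twitter.com/statuses/user_timeline/%s.xml" % username
            with urllib.request.urlopen(url, timeout=10) as resp:
                statuses = list(ET.parse(resp).iter("status"))  # newest first
            fresh = []
            for status in statuses:
                if status.findtext("id") == last_seen.get(username):
                    break                       # everything older was seen
                text = status.findtext("text") or ""
                if "$" in text:                 # ticker-symbol heuristic
                    fresh.append(text)
            if statuses:
                last_seen[username] = statuses[0].findtext("id")
            return fresh

        while True:
            for user in ("someuser",):          # usernames collected at signup
                for tweet in poll(user):
                    print("new ticker tweet:", tweet)
            time.sleep(120)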

    Read the article

  • Securing a Windows pinned website tile

    - by Stijn de Voogd
    I'm currently working on a pinned website tile for my website, and instead of using a static XML file, I'm linking the tile to a web API that returns user-specific XML. My question is: is it possible to secure this tile so that a user needs to be logged in before the data loads? The pinned-website live tile doesn't send any security request headers or cookies:

    Http: Request, GET /v1/livetile/firsttile
      Command: GET
      URI: /v1/livetile/firsttile
      ProtocolVersion: HTTP/1.1
      UserAgent: Microsoft-WNS/6.3
      Host: 192.168.14.109:2089
      Cache-Control: no-cache
      HeaderEnd: CRLF

    Sidenote: notice how it's not even sending an Accept header, even though it only wants XML. Info: http://msdn.microsoft.com/en-US/library/ie/dn455106 http://msdn.microsoft.com/en-us/library/ie/hh761491.aspx# Thanks in advance!

    Read the article

  • Stack trace in website project, when debug = false

    - by chandmk
    We have a website project. We are logging unhandled exceptions via an AppDomain-level exception handler. When we set debug="true" in web.config, the exception log shows the offending line numbers in the stack trace, but when we set debug="false", the log does not display the line numbers. We are not in a position to convert the website project into a web application project at this time; it is a legacy application and almost all the code is in .aspx pages. We also need to leave the project in 'updatable' mode, i.e. we can't use the precompile option. We are generating PDB files. Is there any way to tell this kind of website project to generate the PDB files and show the line numbers in the stack trace?

    Read the article

  • SharePoint: HTTPS area in a public website

    - by Hugo Migneron
    I'm working on a public website that was built using SharePoint (WSS). We need to add an area to the site where people will be able to purchase items with their credit cards, and obviously that area needs to be secured. The website uses forms-based authentication, and the users need to stay logged in when they are moved back and forth from the HTTPS zone. I know how to enable SSL for a new web application / site collection, but this isn't really an option for me, as the website is already online and we don't want the whole thing to be secured. I am comfortable with the development of the web parts involved (payment module, shopping cart, etc.), but I can't really figure out how to create only certain HTTPS pages when the site collection is created. Can you have features that deploy pages that are secured? If so, how? Can you have a zone where SSL is enabled, but where users are redirected to and from it without losing their authentication (FBA)? Thanks!

    Read the article

  • Collaborative localization website supporting Android strings.xml?

    - by Nicolas Raoul
    My open source Android application is internationalized the Android way, with strings.xml files. The community has many people from many countries, and they are willing to contribute and improve translations using a collaborative website. There is Launchpad, but it only supports the gettext format, so we would have to use conversion scripts, which is not very convenient. There is Crowdin, but somehow that website seems dead: nearly no projects, and the download links do not work. We actually started using Crowdin, but all download links fail to give any strings.xml file back, see here. What website is convenient for translating open source Android applications?

    Read the article

  • Regular expression for website URLs only

    - by Katie
    Hi, I'm new to regular expressions. I need to find just website URLs in some text, and I'm looking for a regular expression able to find strings like: www.my.home, http://my.site.it But this regular expression should not find strings like [email protected], or a URL that is already inside an HTML tag: <a href="http://www.my.site.com/"><span style="font-style: normal;">www.mambo-test.org</span></a> I tried this one: \b((https?://[^ ])|(www.[^ ])) but it also finds the website in the href and between the tags: <a href="http://www.my.site.com/"><span style="font-style: normal;">www.mambo-test.org</span></a> and I don't know how to exclude this case.
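
    A hedged sketch of one way to do it in Python: a regex alone can't reliably tell "inside an anchor" from "outside", so this blanks out existing <a>...</a> elements first, then matches bare URLs, with a lookbehind to skip e-mail addresses. The sample text is illustrative.

        import re

        ANCHOR = re.compile(r"<a\b.*?</a>", re.IGNORECASE | re.DOTALL)
        URL = re.compile(r"(?<![\w@.])((?:https?://|www\.)[^\s<>\"',]+)")

        def find_bare_urls(text):
            # Remove already-linked URLs, then match what is left.
            return URL.findall(ANCHOR.sub(" ", text))

        sample = ('visit www.my.home or http://my.site.it, mail me@my.site, '
                  '<a href="http://www.my.site.com/"><span>www.mambo-test.org'
                  '</span></a>')
        print(find_bare_urls(sample))
        # -> ['www.my.home', 'http://my.site.it']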

    Read the article

  • Facebook doesn't work on my website

    - by Wilker Augusto
    So, I used to have the Facebook Like button and Like box working normally, but now I have created a new blog and a new Facebook app, and Facebook just doesn't work on it. Problem: with the Like box or the normal Like button, it asks me to confirm the like, but after confirmation nothing happens. Can anybody please explain how to make it work on my website? I tried to follow the instructions about Open Graph and lots of things that I did not use last time on a website, and nothing worked for me. So, can anybody please explain, step by step, what I need to do to make the Facebook application work on my website? Please and thanks :) P.S.: sorry for my English

    Read the article

  • Alternative or successor to GDBM

    - by Anon Guy
    We have a GDBM key-value database as the backend to a load-balanced, web-facing application implemented in C++. The data served by the application has grown very large, so our admins have moved the GDBM files from "local" storage (on the webservers, or very close by) to a large, shared, remote, NFS-mounted filesystem. This has affected performance. Our performance tests (in a test environment) show page load times jumping from hundreds of milliseconds (for local disk) to several seconds (over NFS on the local network), sometimes getting as high as 30 seconds. I believe a large part of the problem is that the application makes lots of random reads from the GDBM files, that these are slow over NFS, and that this will be even worse in production (where the front-end and back-end have even more network hardware between them) and as our database grows even bigger. While this is not a critical application, I would like to improve performance, and I have some resources available, including application developer time and Unix admins. My main constraint is time: I only have the resources for a few weeks. As I see it, my options are: 1) Improve NFS performance by tuning parameters. My instinct is that we won't get much out of this, but I have been wrong before, and I don't really know very much about NFS tuning. 2) Move to a different key-value database, such as memcachedb or Tokyo Cabinet. 3) Replace NFS with some other protocol (iSCSI has been mentioned, but I am not familiar with it). How should I approach this problem?
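
    Before choosing, it is cheap to quantify how much of the slowdown is per-read NFS latency. A hedged micro-benchmark sketch using Python's stdlib dbm.gnu module (a GDBM binding, available where GDBM is installed); the two paths are hypothetical stand-ins for a local copy and the NFS mount:

        import dbm.gnu
        import random
        import time

        def bench_random_reads(path, n=10000):
            db = dbm.gnu.open(path, "r")
            keys, k = [], db.firstkey()
            while k is not None:             # enumerate keys GDBM-style
                keys.append(k)
                k = db.nextkey(k)
            start = time.perf_counter()
            for _ in range(n):
                db[random.choice(keys)]      # random read: the hot pattern
            elapsed = time.perf_counter() - start
            db.close()
            return elapsed

        for path in ("/var/local/app.gdbm", "/mnt/nfs/app.gdbm"):
            print(path, bench_random_reads(path), "seconds")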

    Read the article

  • Implementing Google Search Appliance results into a website

    - by Adam Jenkin
    I'm interested to hear people's preferred methods or approaches for implementing the search results from a Google Search Appliance in an existing website; more specifically, how people prefer to implement/embed the search results into their existing site while persisting the surrounding website elements (menus, membership, etc.) around the search results. As far as I am aware, there are three different approaches. 1) Sub-domain, handling everything in the XSLT: create a search.mysite.com which is completely handled by the Google XSLT, and embed the surrounding site components in the XSLT. 2) Embed search results into the existing site using an iframe: use the existing website and just use an iframe to import results into the page. 3) Embed results into the existing site using server-side processing: this is how I have previously integrated search into a site, using a combination of bespoke development and the GSALib project. I would be interested to hear if anyone has other suggestions, and where people have benefited from or regretted using the above approaches.
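
    For the server-side option (approach 3), the appliance can return raw XML that you parse and render inside your own templates, so menus and membership chrome survive. A hedged sketch; the query parameters shown (output=xml_no_dtd, client, site) are the usual GSA ones, but should be checked against your appliance's documentation, and the host is hypothetical:

        import urllib.parse
        import urllib.request
        import xml.etree.ElementTree as ET

        GSA_HOST = "http://gsa.example.com"   # hypothetical appliance address

        def search(query):
            params = urllib.parse.urlencode({
                "q": query,
                "output": "xml_no_dtd",
                "client": "default_frontend",
                "site": "default_collection",
            })
            with urllib.request.urlopen("%s/search?%s" % (GSA_HOST, params)) as resp:
                tree = ET.parse(resp)
            # Each hit is an <R> element with <U> (url) and <T> (title) children.
            return [(r.findtext("U", ""), r.findtext("T", "")) for r in tree.iter("R")]

        for url, title in search("annual report"):
            print(title, "->", url)   # render inside the site's own template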

    Read the article

  • Adding a Red5 app to a multiuser website

    - by Zakaria
    Hi everybody, I have an MVC PHP website where users can publish their public information: http://www.example.com/foobar/profile. Besides this project, and based on some Red5 samples, I have an application (done with Flex) that sends audio: rtmp://server/sendAudio (very basic, but it works). I want to create, for each subscriber on my website, an admin area where they can send an audio stream: http://admin.example.com/foobar. And when someone goes to their public profile, they can listen to the streamed audio: http://www.example.com/foobar/profile. How can I use my Red5/Flash app dynamically with my PHP website so that each of my users can broadcast their own channel? Do you have some experience to share? Thank you, Regards.

    Read the article

  • Communicating from website to desktop application (not vice-versa)

    - by chakrit
    I am designing an application that will have to react to certain actions triggered by a website, much the same way the Last.FM client does (if you have used it). The way the Last.FM client works is that it registers a custom protocol in Windows (lastfm://), and on their website they use that protocol to trigger certain actions in the player; e.g., lastfm://artist//similarartists will direct any running Last.FM client to load up similar artists. I would like to do the same thing for my application. Is this a good idea? Are there any good alternatives for sending a message from a website to a desktop client in this manner?
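
    The lastfm:// trick is an ordinary Windows registry entry: a key named after the scheme carrying a "URL Protocol" value, with a shell\open\command subkey pointing at the executable, which then receives the full URL as a command-line argument. A hedged sketch in Python (the scheme name and install path are hypothetical; writing HKEY_CLASSES_ROOT typically needs admin rights):

        import winreg   # Windows-only standard library module

        EXE = r"C:\Program Files\MyApp\myapp.exe"   # hypothetical path

        def register_protocol(scheme="myapp"):
            root = winreg.CreateKey(winreg.HKEY_CLASSES_ROOT, scheme)
            winreg.SetValueEx(root, "", 0, winreg.REG_SZ,
                              "URL:%s Protocol" % scheme)
            winreg.SetValueEx(root, "URL Protocol", 0, winreg.REG_SZ, "")
            cmd = winreg.CreateKey(root, r"shell\open\command")
            # "%1" expands to the full URL, e.g. myapp://artist/similar
            winreg.SetValueEx(cmd, "", 0, winreg.REG_SZ, '"%s" "%%1"' % EXE)

        register_protocol()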

    Read the article

  • Refactoring or Rewriting Monolithic PHP Spaghetti Codebase

    - by nategood
    I've inherited a really poorly designed PHP spaghetti-code project. It has been gaining a good bit of traffic recently and is starting to have performance issues on top of the poor monolithic code base: it's maxing out a chunky 16GB dedicated machine when it really shouldn't be. I'm planning on doing some performance tweaks right off the bat to help the performance issue, but this still won't really help the horrible code base. The team is small but expecting to grow very soon. I've read Joel's article on the troubles of doing a complete rewrite and see the concerns. But how bad does the code base have to be before you consider a rewrite? There is PHP handling logic interjected into what one would usually consider a "view". Even worse, in some places SQL statements are in these same files! The only real separation of presentation and logic is a few PHP scripts that serve as function libraries. These scripts do most of the ORM stuff... if you can even call it that. Trying to slowly refactor this seems like a nightmare. Open to your thoughts and opinions... however, not interested in hearing "Run away, run away!".

    Read the article

  • Client / Server security from mobile to website

    - by Amir Latif
    Hey. I'm new to the world of web programming and am learning a bunch of fairly simple new pieces of tech, trying to piece them all together. So, we have a simple client (currently iPhone, moving to J2ME soon) that's pulling down lists of data via PHP, which is talking to a MySQL DB. I have a rudimentary user/login system so that data is only served to someone who matches a known user, etc., either on the website or on the client. All the PHP scripts on the website that query the DB check to make sure an active session is in place, otherwise dumping the user back to the login screen. I've read a little about SSL and want to know if that is sufficient to protect the website AND the data passing between the server and the client?

    Read the article

  • ASP.NET website membership and C# Windows Forms membership

    - by user1638362
    (This is not real; it's just a project I'm working on.) I've created a hotel management system in C# Windows Forms; it allows staff members to add/edit/update rooms, reservations, customers, etc. Alongside this Windows Forms app, I'm creating an ASP.NET website where customers should be able to register and reserve rooms online. I've come to the point where I need to create some type of membership method for this website, and it should correspond to the membership of the Windows Forms app. However, I'm not sure what method of membership would be best suited for this. I have looked into ASP.NET Membership; it's what I want, but it creates its own schema, and I don't know how I can relate that information to my customers table and the C# Windows Forms app. I would ideally like this to resemble a real-life situation as much as possible. Am I going about this the wrong way? In terms of the C# Windows Forms app, what other technology would a business use to manage a system like this, where they can add/edit/update their system and have a website that relates to it? What are my options here?

    Read the article

  • Correct structure and approach for website versioning

    - by Saif Bechan
    Recently I started using Git to version my website. It makes it really easy to see how my project develops, and I always have safe backups in different places on the web. Now my main question is whether it is recommended to version the whole root of the website. I have a basic structure that looks something like this:

        /httpdocs
        /config
        /media
        /application
        index.php
        .htaccess

    1) Should I version the /httpdocs folder itself, or just the contents of the folder? 2) Is it recommended to version the media folder? In the media folder I have several images for the overall layout, and some other images for the website. These images can be quite large. I work on these images from time to time, so they change, but I hardly ever need an old image again, so isn't this just taking up precious storage space? I would highly appreciate some basic recommendations on this topic.
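
    A hedged note on common practice: source files (/config, /application, index.php, .htaccess) are normally versioned, while large, frequently re-edited binaries whose history is never needed are excluded, since Git keeps every historical version in the repository. A sketch of a .gitignore matching the layout above (the uploads path and extension are illustrative):

        # Keep layout images versioned; exclude bulky media whose old
        # versions are never needed (paths below are illustrative).
        /media/uploads/
        *.psd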

    Read the article

  • REST client website login

    - by Jordan
    I have written a REST service that uses WSSE as an authentication method, but I want to be able to use this REST service through a browser by creating a website around the service. I want the user to be able to log in on the website; then, when they view, for example, the "view users" page, an AJAX request is made to test.com/users and back comes the list. The part I'm trying to get my head around is logging in/out on the website and keeping the user logged in across pages. Since in a true REST implementation there's no state held on the server, I can't use $_SESSION, and now I don't know where to start! What is the best way to go about this? Do I still need to store session information on the server, then possibly use cURL to make the request? Thanks, Jay
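
    One common stateless pattern (a sketch, not the only answer): at login the site issues a signed, expiring token; every AJAX call sends it back, and the service re-checks the signature instead of consulting $_SESSION. A Python illustration of the idea; the secret and token layout are illustrative, not a standard:

        import hashlib
        import hmac
        import time

        SECRET = b"server-side-secret"        # never sent to the client

        def issue_token(username, ttl=3600):
            expires = str(int(time.time()) + ttl)
            payload = "%s|%s" % (username, expires)
            sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
            return "%s|%s" % (payload, sig)

        def verify_token(token):
            try:
                username, expires, sig = token.rsplit("|", 2)
            except ValueError:
                return None
            payload = "%s|%s" % (username, expires)
            good = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
            if hmac.compare_digest(sig, good) and time.time() < int(expires):
                return username               # token is genuine and fresh
            return None

        print(verify_token(issue_token("jay")))   # -> jay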

    Read the article
