Search Results

Search found 229 results on 10 pages for 'khalid saif'.

Page 7/10 | < Previous Page | 3 4 5 6 7 8 9 10  | Next Page >

  • Tips on creating user interfaces and optimizing the user experience

    - by Saif Bechan
    I am currently working on a project where a lot of user interaction is going to take place. There is also a commercial side, as people can buy certain items and services. In my opinion a good blend of user interface, speed and security is essential for these types of websites. It is fairly easy to use Ajax and JavaScript nowadays to do almost everything, as there are a lot of libraries available such as jQuery and others, but this can bring performance and incompatibility issues, which can lead to users simply going to the next website. The overall look of the website is important too: where to place certain buttons, where to place certain types of articles such as FAQ and support, where and how to display error messages so that the user sees them without being bothered by them, and what overall color scheme to use. The basic question is: how do you create an interface that triggers a user to buy/use your services? I know psychology also plays a huge role in how users interact with your website. The color scheme, for example, is important; when the colors on a website are irritating, you just want to click away. I have not found any articles that explain these concepts. Does anyone have any tips and/or resources with articles that guide you in making the correct choices for your website?

    Read the article

  • SQL last day in month select

    - by Saif Khan
    I have the following table, which is basically a summary of daily transactions and is loaded nightly.

        DateCreated | Sale
        ------------+------
        20100101    | 1000
        20100131    | 2000
        20100210    | 2000
        20100331    | 4000

    I need to display the sale by month, but only for the last day in each month, e.g.

        JAN 2000
        FEB 0
        MAR 4000

    I could probably accomplish this with CASE in my SELECT, but is that efficient? This is SQL Server 2000.

    Read the article

  • Stream data with Node.js

    - by Saif Bechan
    I want to know if it is possible to stream data from the server to the client with Node.js. From what I can understand all over the internet, this should be possible, yet I fail to find a correct example or solution. What I want is a single HTTP POST to Node.js with AJAX, then leave the connection open and continuously stream data to the client. The client will receive this stream and update the page continuously.

    Read the article

  • What am I missing with log4net - No log file created

    - by Saif Khan
    I am trying to use log4net in a VB.NET app, but for some unknown reason it's not creating the log file. Here is my app.config:

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
          <configSections>
            <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net" />
          </configSections>
          <log4net>
            <appender name="FileAppender" type="log4net.Appender.FileAppender">
              <file value="c:\log-file.txt" />
              <appendToFile value="true" />
              <layout type="log4net.Layout.PatternLayout">
                <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
              </layout>
            </appender>
            <root>
              <level value="ALL" />
              <appender-ref ref="FileAppender" />
            </root>
          </log4net>
        </configuration>

    Here is the app code:

        Imports log4net

        Public Class Form1
            Dim log As ILog

            Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
                log.Error("test")
            End Sub

            Private Sub Form1_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
                log4net.Config.XmlConfigurator.Configure()
                log = log4net.LogManager.GetLogger("TestThings")
            End Sub
        End Class

    "TestThings" is the name of the VS project. What am I missing?

    Read the article

  • Performance problem with System.Net.Mail

    - by Saif Khan
    I have this unusual problem with mailing from my app. At first it wasn't working (I was getting "unable to relay" errors); anyway, I added the proper authentication and it works. My problem now is that if I try to send around 300 emails (each with a 500k attachment), the app starts hanging around 95% of the way through the process. Here is some of my code, which is called for each mail to be sent:

        Using mail As New MailMessage()
            With mail
                .From = New MailAddress(My.Resources.EmailFrom)
                For Each contact As Contact In Contacts
                    .To.Add(contact.Email)
                Next
                .Subject = "Accounting"
                .Body = My.Resources.EmailBody
                'Back the stream up to the beginning, or else the attachment
                'will be sent as a zero (0) byte file.
                attachment.Seek(0, SeekOrigin.Begin)
                .Attachments.Add(New Attachment(attachment, String.Concat(Item.Year, Item.AttachmentType.Extension)))
            End With

            Dim smtp As New SmtpClient("192.168.1.2")
            With smtp
                .DeliveryMethod = SmtpDeliveryMethod.Network
                .UseDefaultCredentials = False
                .Credentials = New NetworkCredential("username", "password")
                .Send(mail)
            End With
        End Using

        With item
            .SentStatus = True
            .DateSent = DateTime.Now.Date
            .Save()
        End With

        Return

    I was thinking: can I just prepare all the mails and add them to a collection, then open one SMTP connection and iterate the collection, calling Send like this?

        Using mail As New MailMessage()
            ...
            MailCollection.Add(mail)
        End Using
        ...
        Dim smtp As New SmtpClient("192.168.1.2")
        With smtp
            .DeliveryMethod = SmtpDeliveryMethod.Network
            .UseDefaultCredentials = False
            .Credentials = New NetworkCredential("username", "password")
            For Each mail In MailCollection
                .Send(mail)
            Next
        End With

    Read the article

  • Connect to a web API from .NET

    - by Saif Khan
    How can I access and consume a web API from .NET? The API is not a .NET API. Here is sample code I have in Ruby:

        require 'uri'
        require 'net/http'

        url = URI.parse("http://account.codebasehq.com/widgets/tickets")
        req = Net::HTTP::Post.new(url.path)
        req.basic_auth('dave', '6b2579a03c2e8825a5fd0a9b4390d15571f3674d')
        req.add_field('Content-type', 'application/xml')
        req.add_field('Accept', 'application/xml')

        xml = "<ticket><summary>My Example Ticket</summary><status-id>1234</status-id><priority-id>1234</priority-id><ticket-type>bug</ticket-type></ticket>"

        res = Net::HTTP.new(url.host, url.port).start {|http| http.request(req, xml)}

        case res
        when Net::HTTPCreated
          puts "Record was created successfully."
        else
          puts "An error occurred while adding this record"
        end

    Where can I find information on consuming an API like this from .NET? I am aware of how to use .NET web services.

    Read the article

  • Changing html <-> ajax <-> php/mysql to a threaded approach

    - by Saif Bechan
    I have an application that needs to be updated in real time. There are various counters and other information that have to come from the database, and the system needs to be up to date for the user. My approach now is just a normal AJAX request every second to get the new values from the database: a piece of JavaScript loops every second, getting the values through AJAX. This works fine, but I think it is very inefficient.

    The problem

    - There is an AJAX script that loops every second requesting data from PHP (on the server it has to load the PHP interpreter).
    - The PHP file has to get the data and format it correctly:
      - PHP has to make a connection with the MySQL database
      - work with the database (reads, never writes)
      - format the data so it can be sent
      - send the data back to the browser
      - close the database connection and close the PHP interpreter
    - Last, the browser has to read these values and update the various HTML parts.

    So with this approach it has to load the interpreter and make a DB connection every second. I was thinking of a way to make this more efficient, and maybe use a threaded approach.

    Threaded approach

    - Do a post to PHP when you enter the page and keep the connection alive.
    - In PHP, only load the interpreter once, and make a connection to the DB once.
    - Every second, send an AJAX response to the JavaScript listener.
    - The JavaScript listener then just changes values as the responses from PHP arrive.

    I think this approach will be a great optimization of the server load and overall performance. But I can spot some weak points in the system and I need some help with these.

    Problems with the approach

    - PHP execution time limit: I don't think PHP is designed for such a setup. I know there is a time limit on PHP script execution, and I don't know if an everlasting loop in PHP will cause any serious CPU/memory problems.
    - Sending an AJAX request without breaking: I don't know if it is possible to have just one AJAX post action that stays open and keeps accepting data.
    - User exits the page: what will happen when the user exits the page and the PHP script is still going? Will it go on forever?
    - Security issues: so far I can't think of any. Almost every setup has some security issues; maybe there are some with this solution I do not know of.

    Open to other solutions

    I really want to change the setup as it is now and move to a threaded approach or better. If someone has a better approach to tackle this I definitely want to hear it. Maybe some other script is better suited for an ongoing runtime. I only know PHP and Java, so any suggestions are welcome and I am willing to dig through them. I know there are things like Perl, Python, etcetera that are used for this type of threading, but I don't know which one is best suited.

    When using another script

    If the best way is to go with another type of script like Perl, Python, etcetera, I do have some criteria:

    - The script has to be accessible via AJAX post.
    - If it accepts some kind of JSON encode/decode, that would be nice.
    - The script has to be able to access the session file. This is essential because I need to know if the user is logged in.
    - The script has to be able to easily talk to MySQL.

    All comments are welcome, and I hope this question is helpful to others too. Cheers!
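
    As an illustration of the "keep the connection open" idea described above, here is a minimal long-polling sketch in PHP. It is not from the question: get_counters() is an assumed helper that reads the current values from MySQL, and last_hash is a made-up request parameter. The script holds one request open for up to roughly 30 seconds, checks for changes once per second, and answers as soon as something changes; the JavaScript side simply issues the next POST as soon as a response arrives.

        <?php
        // Hypothetical long-polling endpoint: one AJAX POST stays open until new data arrives.
        // get_counters() is an assumed helper that returns the current values as an array.
        set_time_limit(35);              // allow the script to run longer than one polling cycle
        session_start();                 // so the script can check whether the user is logged in
        $last = isset($_POST['last_hash']) ? $_POST['last_hash'] : '';

        for ($i = 0; $i < 30; $i++) {    // poll the database for at most ~30 seconds
            $data = get_counters();      // reads only, never writes
            $hash = md5(serialize($data));
            if ($hash !== $last) {       // something changed: answer immediately
                header('Content-Type: application/json');
                echo json_encode(array('hash' => $hash, 'data' => $data));
                exit;
            }
            sleep(1);                    // nothing new yet, wait a second and check again
        }

        // Timeout with no change: tell the client to reconnect with the same hash.
        header('Content-Type: application/json');
        echo json_encode(array('hash' => $last, 'data' => null));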

    Read the article

  • How to benchmark apache/nginx setup

    - by Saif Bechan
    I am planning to set up nginx as a reverse proxy: Apache will deliver my dynamic content, and nginx will deliver the static content. The configuration I have now is just Apache with FastCGI. This gives me no configuration problems and runs great. After I have set up nginx I want to run some benchmarks to see if I really got a performance increase; otherwise I will switch back. Does anyone know how I can benchmark this type of setup? Or maybe someone did this already and has some canned results; I would be glad to hear them. PS. I know this is more a Server Fault type of question, but I have seen numerous posts about Apache and nginx here, so I thought I'd give it a try.

    Read the article

  • [PHP] Difference between normal and magic setters and getters

    - by Saif Bechan
    I am using a magic getter/setter class for my session variables, but I don't see any difference between normal setters and getters. The code:

        class session {
            public function __set($name, $value) {
                $_SESSION[$name] = $value;
            }

            public function __unset($name) {
                unset($_SESSION[$name]);
            }

            public function __get($name) {
                if (isset($_SESSION[$name])) {
                    return $_SESSION[$name];
                }
            }
        }

    Now the first thing I noticed is that I have to call $session->_unset('var_name') to remove a variable; nothing 'magical' about that. Secondly, when I try to use $session->some_var it does not work; I can only get the session variable using $_SESSION['some_var']. I have looked at the PHP manual but the functions look the same as mine. Am I doing something wrong, or is there not really anything magic about these functions?
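
    For reference, a small sketch (not from the question) of how such a class is normally exercised: the magic methods fire on ordinary property syntax and on unset(), not on explicit method calls, and __get/__set/__unset are only invoked for properties that are not declared on the class. Assuming the session class above and an active session:

        <?php
        session_start();
        $session = new session();

        $session->some_var = 'hello';            // triggers __set('some_var', 'hello')
        echo $session->some_var;                 // triggers __get('some_var'), prints 'hello'
        unset($session->some_var);               // triggers __unset('some_var')

        var_dump(isset($_SESSION['some_var']));  // bool(false)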

    Read the article

  • Calendar Table - Week number of month

    - by Saif Khan
    I have a calendar table with data from year 2000 to 2012 (2012 wasn't intentional!). I just realized that I don't have the week number of the month (e.g. in January 1,2,3,4; in February 1,2,3,4). How do I go about calculating the week numbers in a month to fill this table? Here is the table schema:

        CREATE TABLE [TCalendar] (
            [TimeKey] [int] NOT NULL ,
            [FullDateAlternateKey] [datetime] NOT NULL ,
            [HolidayKey] [tinyint] NULL ,
            [IsWeekDay] [tinyint] NULL ,
            [DayNumberOfWeek] [tinyint] NULL ,
            [EnglishDayNameOfWeek] [nvarchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [SpanishDayNameOfWeek] [nvarchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [FrenchDayNameOfWeek] [nvarchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [DayNumberOfMonth] [tinyint] NULL ,
            [DayNumberOfYear] [smallint] NULL ,
            [WeekNumberOfYear] [tinyint] NULL ,
            [EnglishMonthName] [nvarchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [SpanishMonthName] [nvarchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [FrenchMonthName] [nvarchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [MonthNumberOfYear] [tinyint] NULL ,
            [CalendarQuarter] [tinyint] NULL ,
            [CalendarYear] [char] (4) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [CalendarSemester] [tinyint] NULL ,
            [FiscalQuarter] [tinyint] NULL ,
            [FiscalYear] [char] (4) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
            [FiscalSemester] [tinyint] NULL ,
            [IsLastDayInMonth] [tinyint] NULL ,
            CONSTRAINT [PK_TCalendar] PRIMARY KEY CLUSTERED ( [TimeKey] ) ON [PRIMARY]
        ) ON [PRIMARY]
        GO

    Read the article

  • mod_php / mod_suphp / FastCGI | Which do you recommend and why?

    - by Saif Bechan
    I am at the point where I have to choose what type of setup my application should run on. I know there are some setups that Apache runs smoothly on, but they all have their downsides. System: Apache 2 / PHP 5.2. I hope you can give me some tips from firsthand experience. To give you an example of what should be covered:

    - Performance
    - Ease of setup
    - Security

    I know this does not really involve programming, but I have seen posts concerning this and I know that you guys/girls here are certainly qualified to comment on this subject.

    Read the article

  • How to protect copyright on large web applications?

    - by Saif Bechan
    Recently I read about Myows, described as "the universal copyright management and protection app for smart creatives". It is used to protect copyright and more. Currently I am working on a large web application which is in a late testing phase. Because of the complexity of the app there are not many versions of it online, so copyright will be a huge issue for me, as much of the code is in JavaScript and is easily copyable. I was glad to see that there is a company out there that provides such services, and naturally I wanted to know if there were people using it. I did not know that this type of concept was so new. Is protecting copyright a good idea for a large web application? If so, do you think Myows will be worth using, or are there better ways to achieve that? Update: Wow, there is no better person to have answered this question than the creator himself. There were some nice points made in the answer, and I think it will be a good service for people like me. In the next couple of weeks I will be looking further into the subject and start uploading my code to see how it works out. I will leave this question up because I do want some more suggestions on this topic.

    Read the article

  • How to give your website customers a secure feeling

    - by Saif Bechan
    I was wondering how you can give your website customers the confidence that you are not tinkering with the database values. I am planning on running a website which falls in the realm of an online game. There is some kind of credit system involved that people have to pay for. Now I was wondering how sites like this assure their customers that there is no foul play in the database itself. As I am the database admin, I can pretty much change all the values from within without anyone knowing I did, hence letting someone win who is not rightfully the winner. Is it maybe an option to encrypt and decrypt the credits people have so I can't change them? Or is there maybe a company I can hire that checks my company for foul play?
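
    One possible approach, purely as a sketch and not something from the question: store a keyed signature (HMAC) next to each credit balance, with the key held outside the database (for example by an external auditor), so that a value edited directly in the database no longer verifies. The function and parameter names below are made up for illustration.

        <?php
        // Sketch: detect tampering of a stored credits value by keeping an HMAC next to it.
        // The signing key lives outside the database, so editing the row directly
        // cannot produce a matching signature.

        function sign_credits($user_id, $credits, $signing_key) {
            return hash_hmac('sha256', $user_id . '|' . $credits, $signing_key);
        }

        function credits_are_valid($user_id, $credits, $stored_signature, $signing_key) {
            // On newer PHP versions a constant-time comparison (hash_equals) is preferable.
            return sign_credits($user_id, $credits, $signing_key) === $stored_signature;
        }

        // Example usage with made-up values:
        $key = 'secret-key-held-by-auditor';
        $sig = sign_credits(42, 1500, $key);                 // store $sig in its own column
        var_dump(credits_are_valid(42, 1500, $sig, $key));   // bool(true)
        var_dump(credits_are_valid(42, 9999, $sig, $key));   // bool(false): the value was altered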

    Read the article

  • MySQL query cache and PHP variables

    - by Saif Bechan
    I have seen the following statement made about the query cache:

        // query cache does NOT work
        $r = mysql_query("SELECT username FROM user WHERE signup_date >= CURDATE()");

        // query cache works!
        $today = date("Y-m-d");
        $r = mysql_query("SELECT username FROM user WHERE signup_date >= '$today'");

    So the query cache only works on the second query. I was wondering if the query cache will also work on this:

        define('__TODAY', date("Y-m-d"));
        $r = mysql_query("SELECT username FROM user WHERE signup_date >= '".__TODAY."'");
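
    A quick way to see why the constant should behave like the variable (a sketch, assuming today is 2010-05-01): by the time the string reaches MySQL, both the variable and the constant have been interpolated into a plain date literal, and the query cache keys on the literal SQL text it receives, whereas a query containing the non-deterministic CURDATE() is never cached.

        <?php
        $today = date("Y-m-d");
        define('__TODAY', date("Y-m-d"));

        $q1 = "SELECT username FROM user WHERE signup_date >= CURDATE()";
        $q2 = "SELECT username FROM user WHERE signup_date >= '$today'";
        $q3 = "SELECT username FROM user WHERE signup_date >= '" . __TODAY . "'";

        echo $q1, "\n";  // ... >= CURDATE()      -> not cacheable (non-deterministic function)
        echo $q2, "\n";  // ... >= '2010-05-01'   -> plain literal, cacheable
        echo $q3, "\n";  // ... >= '2010-05-01'   -> identical literal, equally cacheable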

    Read the article

  • How to create a program that can be run by: #service myservice start

    - by Saif Bechan
    I am new to Linux and want to know what kind of programs can be run using

        # service myservice start

    such that the program stays on until stopped. Are they normal C++ programs or are they different? In some tutorials I have seen that they use ./myprogram to start a program. Another thing I have seen is the usage of .sh files. One last type of program I see is executed by the command

        # /usr/bin/myprogram

    Can someone explain the difference between these or point me to a basic tutorial/guide?

    Read the article

  • Working locally and uploading on save in Eclipse with Aptana

    - by Saif Bechan
    Currently I am working in Dreamweaver for my web project. The way I work is that I open local files, and when I save, the files are uploaded via FTP to my server. This works great because I have the same files locally and on the server, and I don't have to move files around a lot. Is this possible in Eclipse with Aptana? I have tried working with RSE (Remote System Explorer), but then I only work with the remote files. I have also tried the normal Aptana FTP features, but there is no 'upload on save' feature. Does anyone have some recommendations for me?

    Read the article

  • Checking if language is set in url with regex

    - by Saif Bechan
    I am working on a multi-language setup. My URLs look something like this:

        http://www.mydomain.com/en/about/info
        http://www.mydomain.com/nl/about/info

    Now I use a small regex script that redirects users when they use a link without a language. The script looks like this:

        preg_match('~^/[a-z]{2}/~', $_SERVER['REQUEST_URI']);

    This finds out whether a language is set (en|nl|de etc.). This works fine for all links except these:

        http://www.mydomain.com/en
        http://www.mydomain.com/nl

    There is no trailing slash, so the regex cannot find the given values. Does anyone know a fix for this?
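
    One possible fix, as a sketch rather than the pattern from the question: make the part after the two-letter code accept either a slash or the end of the string.

        <?php
        // Matches '/en/about/info', '/en/', and also the bare '/en'.
        function has_language($uri) {
            return preg_match('~^/[a-z]{2}(/|$)~', $uri) === 1;
        }

        var_dump(has_language('/en/about/info'));  // bool(true)
        var_dump(has_language('/nl'));             // bool(true)
        var_dump(has_language('/about/info'));     // bool(false)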

    Read the article

  • Run a PHP script every second using CLI

    - by Saif Bechan
    Hello, I have a dedicated server running CentOS with a Parallels Plesk panel. I need to run a PHP script every second to update my database. There is no alternative timewise; I have checked every method, and it needs to be updated every second. I can reach my script at the URL http://www.mysite.com/phpfile.php?key=123, and this has to be executed every second. Does anyone have any knowledge at all of doing this? I cannot seem to find the answer. I heard about doing it with CLI and PuTTY, but I have no knowledge of this at all. Or can this be done using the Plesk panel? And can the file be executed locally every second, like ./phpfile.php? If someone helps me answer these questions I would really appreciate it. Regards

    EDIT: It has been a few months since I added this question. I ended up using the following code:

        #!/usr/bin/php
        <?php
        $start = microtime(true);
        set_time_limit(60);
        for ($i = 0; $i < 59; ++$i) {
            doMyThings();
            time_sleep_until($start + $i + 1);
        }

    Thank you for this code, guys! My cron job is set to every minute. I have been running this for some time now in a test environment, and it works out great. It runs really fast, and I see no increase in CPU or memory usage.

    Read the article

  • Choice of primary index for MySQL InnoDB

    - by Saif Bechan
    I have an auction website where users can place a bid on a product. I have a primary index on the bids table for easy access to the last placed bid on a product. This index is just a unique auto-incrementing value. During the week this number becomes huge! I was wondering if this is a good setup for the primary key in an InnoDB table. The bids table consists of the following important fields:

        table: bids
        fields: user_id, product_id, bid

    What I want to do is make the primary key a combination of these 3 fields. Is this a good idea, or is this just too much for InnoDB keys?

    Read the article
