Search Results

Search found 9960 results on 399 pages for 'iwork pages'.


  • Where is a SharePoint Community for Webparts?

    - by elhombre
    Hi all, I am searching for a SharePoint Server 2007 Webpart that can do the following: * change password * lost password recovery (e-mail) * change password reminder (e-mail). I have been searching the Internet, but somehow there aren't nearly as many webparts as there are, say, WordPress add-ons. When I am lucky I can find individuals who have made a Webpart that matches one of the specs :( The only things that come near my specs are http://www.envisionit.com/Products/Pages/ExtranetModuleforSharePoint.aspx and http://userchangepassword.codeplex.com. I am wondering where I can find communities that have lots of Webparts to download or sell. Does anybody know of such communities?

    Read the article

  • Visual Studio web tests: Can a coded webtest be run through the Web Test Editor run view?

    - by Frank Rosario
    Hello. Full disclosure: I'm new to Visual Studio Web Tests and coding for them. I've written a webtest, coded in VB, and it runs great. Our QA engineer wants to use this script for performance testing, but he wants the nice GUI you get when you build a WebTest with the VS WebTest Editor and run it. Is there a way to run a coded webtest through this view? He wants to be able to watch each test as it runs to see which pages are having issues, but within the GUI he's used to. Alternatively, I know I could just code something that writes out to a log file, but before I go with that solution I wanted to see if this is possible. Any constructive input is greatly appreciated.
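
    For context, a coded web test is just a class that derives from WebTest and yields WebTestRequest objects; a minimal sketch in C# (the asker's test is in VB, and the URL below is a placeholder, not taken from the question):

        using System.Collections.Generic;
        using Microsoft.VisualStudio.TestTools.WebTesting;

        // Minimal coded web test: one request, fails the step on a non-200 status.
        public class HomePageWebTest : WebTest
        {
            public override IEnumerator<WebTestRequest> GetRequestEnumerator()
            {
                var request = new WebTestRequest("http://example.com/"); // placeholder URL
                request.ExpectedHttpStatusCode = 200;
                yield return request;
            }
        }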

    Read the article

  • Rendering a view to a string in ASP.NET MVC 2

    - by Frank Rosario
    We need to render an ActionResult to a string to add pages to our internal search engine index. We settled on this solution to render to a string. I've run into a problem with the ExecuteResult call used to process the View. Code snippet: var oldController = controllerContext.RouteData.Values["controller"]; controllerContext.RouteData.Values["controller"] = typeof(TController).Name.Replace("Controller", ""); viewResult.ExecuteResult(controllerContext); // this line breaks. I receive the following error: "Object reference not set to an instance of an object". I've confirmed viewResult is not null, so the exception has to be thrown internally in ExecuteResult. What could we be missing?
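
    For comparison, a common alternative in MVC 2 is to bypass ExecuteResult and render the view directly into a StringWriter. A rough sketch, assuming it is called from inside a normal controller action so ControllerContext (HttpContext, RouteData, TempData) is fully populated:

        using System;
        using System.IO;
        using System.Web.Mvc;

        public static class ViewRenderer
        {
            // Renders a named view to a string using the current controller context.
            public static string RenderViewToString(ControllerContext context,
                                                    string viewName, object model)
            {
                context.Controller.ViewData.Model = model;
                using (var writer = new StringWriter())
                {
                    ViewEngineResult result =
                        ViewEngines.Engines.FindPartialView(context, viewName);
                    if (result.View == null)
                        throw new InvalidOperationException("View not found: " + viewName);

                    var viewContext = new ViewContext(context, result.View,
                        context.Controller.ViewData, context.Controller.TempData, writer);
                    result.View.Render(viewContext, writer);
                    result.ViewEngine.ReleaseView(context, result.View);
                    return writer.ToString();
                }
            }
        }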

    Read the article

  • How do you diagnose a 500 error on Heroku when there is no error message in the logs?

    - by lala
    I have a Rails app on Heroku that is serving 500 errors at random intervals. Web pages will display "Internal server error" in plain text, instead of the usual "We're sorry. Something went wrong." page. When I refresh the page, it works fine. The logs don't show me an error message, just » 14:20:34.107 2013-10-11 12:20:33.763690+00:00 heroku router - - at=info method=HEAD path=/ host=www.mydomain.com fwd="184.73.237.85/ec2-184-73-237-85.compute-1.amazonaws.com" dyno=web.1 connect=1ms service=63ms status=200 bytes=0 » 14:21:03.957 2013-10-11 12:21:03.561867+00:00 heroku router - - at=info method=GET path=/ host=www.mydomain.com fwd="50.112.95.211/ec2-50-112-95-211.us-west-2.compute.amazonaws.com" dyno=web.1 connect=0ms service=1ms status=500 bytes=21 Support has told me to look at request queuing in New Relic, but New Relic only shows a big red mark saying the server is down (even though the site works fine when refreshed). With no error messages, I'm at a loss for how to diagnose this issue.

    Read the article

  • Implementing Read Only view in Winform App

    - by Refracted Paladin
    I have an in-house WinForms app for viewing, editing, and inserting member data. There are about 40 separate form pages that they use to manipulate different portions of the data. My question is this: what is the best way of implementing a read-only view for a form page? My thought was to cycle through the controls setting Enabled = False, or to leave them be but not allow any data changes (no Save button, etc.) unless the page is "unlocked". I am curious how others handle this in WinForms apps.
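
    A rough sketch of the control-walking idea in C#; btnSave is a hypothetical save button, and real forms may need more control types handled:

        using System.Windows.Forms;

        // Lives in a form page's code-behind. Usage: SetReadOnly(this, true);
        // Text boxes stay selectable/copyable via ReadOnly; other inputs are disabled.
        private void SetReadOnly(Control root, bool readOnly)
        {
            foreach (Control child in root.Controls)
            {
                var textBox = child as TextBoxBase;
                if (textBox != null)
                    textBox.ReadOnly = readOnly;
                else if (child is ButtonBase || child is ListControl || child is DateTimePicker)
                    child.Enabled = !readOnly;

                if (child.HasChildren)
                    SetReadOnly(child, readOnly);   // descend into panels and group boxes
            }
            btnSave.Enabled = !readOnly;            // hypothetical save button
        }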

    Read the article

  • F#: Define an abstract class that inherits an interface, but does not implement it

    - by akaphenom
    I would like to define an abstract class that inherits from UserControl and my own interface IUriProvider, but doesn't implement it. The goal is to be able to define pages (for Silverlight) that implement UserControl but also provide their own Uris (and then stick them in a list/array and deal with them as a set): type IUriProvider = interface abstract member uriString: String ; abstract member Uri : unit -> System.Uri ; end type UriUserControl() as this = inherit IUriProvider with abstract member uriString: String ; inherit UserControl() Also, I would like to implement the Uri in the definition as a property getter, and am having issues with that as well. This does not compile: type IUriProvider = interface abstract member uriString: String with get; end Thank you...

    Read the article

  • MFMailComposeViewController doesn't always display attachments

    - by davbryn
    I'm attaching a few files to an email to export from the application I've written, namely a .pdf and a .png. I create these by rendering a view to a context and creating an image and a PDF. I can validate that the files are created properly (I can confirm this by looking in my app's sandbox from Finder, and also by sending the email; I receive the files correctly). The problem I'm getting is that larger files don't have a preview generated for them within the MFMailComposeViewController view (I simply get a blue icon with a question mark). Is there a limitation on file sizes that can be attached in order for the preview to function correctly? With small files it works as expected, but if I try to attach a PDF with the following properties: Pages: 1, Dimensions: 2414 x 1452, Size: 307 KB, the file is generated correctly but displays the question mark icon. If there is no way around that, can I remove the attachment preview altogether? Many thanks, Bryn

    Read the article

  • Page load problem

    - by AZHAR
    Hi, I am developing a web application. The problem is that I am using a login control (not a .NET control) which is part of the master page and is accessible from all pages. If a user logs in from a page, the login control updates itself and displays some statistics for the logged-in user, but the page itself does not reload. (Some options on the page are visible only to authenticated users, so after login the page should be reloaded to display those options.) After the login method I wrote Response.Redirect(Request.Url.AbsoluteUri), but after this the browser responds with "Page cannot be displayed". Any help would be greatly appreciated. Many thanks, Regards, AZHAR
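
    As a point of comparison, a minimal sketch of reloading the current page after a successful login using the server-relative URL; the LoggedIn handler name is an assumption, since the login control in the question is custom:

        // Code-behind sketch, assuming the custom login control raises some kind
        // of "logged in" event that the page or master page can handle.
        protected void LoginControl_LoggedIn(object sender, EventArgs e)
        {
            // Request.RawUrl is server-relative, so the redirect stays on the same
            // host and simply reloads the current page for the authenticated user.
            Response.Redirect(Request.RawUrl, false);
            Context.ApplicationInstance.CompleteRequest();
        }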

    Read the article

  • ASP.NET MVC reminds me of old Classic ASP spaghetti code...

    - by EdenMachine
    I just went through some MVC tutorials after checking this site out for a while. Is it just me, or do MVC View pages bring back HORRIBLE flashbacks of Classic ASP spaghetti code, with all the jumping in and out of HTML and ASP.NET with yellow delimiters everywhere making it impossible to read? Whatever happened to the importance of code/design separation?? I was really sold on the new technology until the tutorials hit the View page development section. Or am I missing something? (And don't say you can use the template to help, because that's just moving the spaghetti to another location - sweeping it under the rug - it doesn't fix the problem.)

    Read the article

  • HTML - Word Doc Images

    - by Michael
    Okay, I have roughly 150+ pages of procedures, all written in MS Word. The person who wrote the procedures did an excellent job of recording how to perform specific tasks. This individual went through and created screenshots in an MS Word doc. There are approximately 2 screenshots per page, so it is roughly 300 images, and I do not want to reinvent the wheel. Does anyone know a quick way of handling the images? The written portion is pretty straightforward, but the images are what I am struggling with. Regards, Mike

    Read the article

  • Modify HTML Content with Squid

    - by user298814
    We have set up our network as per the tutorial here: https://help.ubuntu.com/community/Upside-Down-TernetHowTo. Basically, we have a Squid proxy that inverts images for pages that clients request. We're trying to modify the script so that we can edit the contents of the webpage before it is sent to the client, but we are not having any luck. I'm wondering if there is something different about .html files that makes this not possible. What is happening is that we do a wget on the requested URI, save it locally, modify it, and then echo back the new URI. The page that the user gets is the unmodified page and not the one that we just changed.

    Read the article

  • Web service reference location?

    - by Damien Dennehy
    I have a Visual Studio 2008 solution that currently consists of three projects: A DataFactory project for business logic/data access. A Web project consisting of the actual user interface, pages, controls, etc. A Web.Core project consisting of utility classes, etc. The application requires consuming a web service. Normally I'd add the service reference to the Web project, but I'm not sure if this is best practice or not. The following options are open to me: Add the reference to the Web project. Add the reference to the Web.Core project, and create a wrapper method that Web will call to consume the web service. Add a new project called Web.Services, and do the same as option 2. This project is expected to increase in size, so I'm open to any suggestions.
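
    A bare-bones sketch of option 2: a thin wrapper in Web.Core around the generated service proxy, so the Web project never references the proxy directly. All names here are hypothetical placeholders.

        namespace Web.Core.Services
        {
            // Hypothetical wrapper; WeatherServiceClient stands in for whatever
            // proxy class the "Add Service Reference" wizard generates.
            public class WeatherGateway
            {
                public string GetForecast(string city)
                {
                    using (var client = new WeatherServiceClient())
                    {
                        return client.GetForecast(city);
                    }
                }
            }
        }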

    Read the article

  • How to check if a variable is defined in a Master file in ASP.NET MVC

    - by Mortanis
    I've got a Site.Master file I've created to be my template for the majority of the site, with a navigation menu. This navigation is dynamically created, based on a recursive entity (called Page): Pages with a parentID of 0 are top level, and naturally each child carries its parent's Id in that field. I've created a quick little HTML Helper that accepts the Id of a Page and generates the nav by doing a foreach on the children whose parentId matches the passed Id. On the majority of the site, I want the Site.Master to use a parentId of 0, but if I'm on a strongly typed View displaying a Page, I naturally want to use the Id of that page. Is there a way to do such conditional logic in a Site.Master (and does that violate MVC rules)? "If I'm on a strongly typed Page of /Page/{Id}, use the Id to render the nav, else use 0."
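
    One way to express that conditional inside Site.Master; this is only a sketch, and both the Page entity and the Html.Navigation helper name are assumptions based on the question:

        <%-- Site.Master: fall back to parentId 0 unless the view's model is a Page. --%>
        <%
            var currentPage = ViewData.Model as Page;            // null on non-Page views
            int navRootId = (currentPage != null) ? currentPage.Id : 0;
        %>
        <%= Html.Navigation(navRootId) %>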

    Read the article

  • Running Script before/after Variables declared?

    - by Sam
    Hi folks. As I understand it, JavaScript .js files are best placed at the very bottom of HTML pages to speed up loading of the rest of the page, as advised by YSlow (Yahoo) and Page Speed (Google). Now, when something in the middle of the page runs a JavaScript script, in Internet Explorer I see a small warning message saying: Uncaught ReferenceError: SWFObject is not defined. When I put my all.js file in the head, the error goes away but the page load slows down. What to do? Actually, I remember it was the same with PHP variables: if I run PHP code but the variable comes later, it just doesn't work; the variable must be defined first. How can I make this workflow better, both for PHP scripts and for JavaScript? Thanks!

    Read the article

  • Optimize php-fpm and varnish for a powerful server

    - by Jim
    My setup is: Intel® Core™ i7-2600 and RAM 16 GB DDR3 RAM varnish+nginx+php-fpm+apc for a not very heavy WordPress blog with W3 Total Cache and CDN My problem is that after 55 hits per second according to blitz.io varnish starts giving out timeouts. CPU usage at this time is hardly 1%. Free memory at all time remains 10GB+. I tried benchmarking php-fpm directly with result of 150hits/s without any timeouts. But after that the CPU usage goes 100% and it stops responding. Can you help me optimize it to handle more? As i understand nginx has nothing to do over here so i dont put its config. php-fpm config listen = /tmp/php5-fpm.sock listen.allowed_clients = 127.0.0.1 user = nginx group = nginx pm = dynamic pm.max_children = 150 pm.start_servers = 7 pm.min_spare_servers = 2 pm.max_spare_servers = 15 pm.max_requests = 500 slowlog = /var/log/php-fpm/www-slow.log php_admin_value[error_log] = /var/log/php-fpm/www-error.log php_admin_flag[log_errors] = on apc extension = apc.so apc.enabled=1 apc.shm_size=512MB apc.num_files_hint=0 apc.user_entries_hint=0 apc.ttl=7200 apc.use_request_time=1 apc.user_ttl=7200 apc.gc_ttl=3600 apc.cache_by_default=1 apc.filters apc.mmap_file_mask=/tmp/apc.XXXXXX apc.file_update_protection=2 apc.enable_cli=0 apc.max_file_size=1M apc.stat=1 apc.stat_ctime=0 apc.canonicalize=0 apc.write_lock=1 apc.report_autofilter=0 apc.rfc1867=0 apc.rfc1867_prefix =upload_ apc.rfc1867_name=APC_UPLOAD_PROGRESS apc.rfc1867_freq=0 apc.rfc1867_ttl=3600 apc.include_once_override=0 apc.lazy_classes=0 apc.lazy_functions=0 apc.coredump_unmap=0 apc.file_md5=0 apc.preload_path Varnish VCL backend default { .host = "127.0.0.1"; .port = "8080"; .connect_timeout = 6s; .first_byte_timeout = 6s; .between_bytes_timeout = 60s; } acl purgehosts { "localhost"; "127.0.0.1"; } # Called after a document has been successfully retrieved from the backend. sub vcl_fetch { # Uncomment to make the default cache "time to live" is 5 minutes, handy # but it may cache stale pages unless purged. (TODO) # By default Varnish will use the headers sent to it by Apache (the backend server) # to figure out the correct TTL. # WP Super Cache sends a TTL of 3 seconds, set in wp-content/cache/.htaccess set beresp.ttl = 24h; # Strip cookies for static files and set a long cache expiry time. 
if (req.url ~ "\.(jpg|jpeg|gif|png|ico|css|zip|tgz|gz|rar|bz2|pdf|txt|tar|wav|bmp|rtf|js|flv|swf|html|htm)$") { unset beresp.http.set-cookie; set beresp.ttl = 24h; } # If WordPress cookies found then page is not cacheable if (req.http.Cookie ~"(wp-postpass|wordpress_logged_in|comment_author_)") { # set beresp.cacheable = false;#versions less than 3 #beresp.ttl>0 is cacheable so 0 will not be cached set beresp.ttl = 0s; } else { #set beresp.cacheable = true; set beresp.ttl=24h;#cache for 24hrs } # Varnish determined the object was not cacheable #if ttl is not > 0 seconds then it is cachebale if (!beresp.ttl > 0s) { # set beresp.http.X-Cacheable = "NO:Not Cacheable"; } else if ( req.http.Cookie ~"(wp-postpass|wordpress_logged_in|comment_author_)" ) { # You don't wish to cache content for logged in users set beresp.http.X-Cacheable = "NO:Got Session"; return(hit_for_pass); #previously just pass but changed in v3+ } else if ( beresp.http.Cache-Control ~ "private") { # You are respecting the Cache-Control=private header from the backend set beresp.http.X-Cacheable = "NO:Cache-Control=private"; return(hit_for_pass); } else if ( beresp.ttl < 1s ) { # You are extending the lifetime of the object artificially set beresp.ttl = 300s; set beresp.grace = 300s; set beresp.http.X-Cacheable = "YES:Forced"; } else { # Varnish determined the object was cacheable set beresp.http.X-Cacheable = "YES"; if (beresp.status == 404 || beresp.status >= 500) { set beresp.ttl = 0s; } # Deliver the content return(deliver); } sub vcl_hash { # Each cached page has to be identified by a key that unlocks it. # Add the browser cookie only if a WordPress cookie found. if ( req.http.Cookie ~"(wp-postpass|wordpress_logged_in|comment_author_)" ) { #set req.hash += req.http.Cookie; hash_data(req.http.Cookie); } } # vcl_recv is called whenever a request is received sub vcl_recv { # remove ?ver=xxxxx strings from urls so css and js files are cached. # Watch out when upgrading WordPress, need to restart Varnish or flush cache. set req.url = regsub(req.url, "\?ver=.*$", ""); # Remove "replytocom" from requests to make caching better. set req.url = regsub(req.url, "\?replytocom=.*$", ""); remove req.http.X-Forwarded-For; set req.http.X-Forwarded-For = client.ip; # Exclude this site because it breaks if cached if ( req.http.host == "sr.ituts.gr" ) { return( pass ); } # Serve objects up to 2 minutes past their expiry if the backend is slow to respond. set req.grace = 120s; # Strip cookies for static files: if (req.url ~ "\.(jpg|jpeg|gif|png|ico|css|zip|tgz|gz|rar|bz2|pdf|txt|tar|wav|bmp|rtf|js|flv|swf|html|htm)$") { unset req.http.Cookie; return(lookup); } # Remove has_js and Google Analytics __* cookies. set req.http.Cookie = regsuball(req.http.Cookie, "(^|;\s*)(__[a-z]+|has_js)=[^;]*", ""); # Remove a ";" prefix, if present. set req.http.Cookie = regsub(req.http.Cookie, "^;\s*", ""); # Remove empty cookies. if (req.http.Cookie ~ "^\s*$") { unset req.http.Cookie; } if (req.request == "PURGE") { if (!client.ip ~ purgehosts) { error 405 "Not allowed."; } #previous version ban() was purge() ban("req.url ~ " + req.url + " && req.http.host == " + req.http.host); error 200 "Purged."; } # Pass anything other than GET and HEAD directly. if (req.request != "GET" && req.request != "HEAD") { return( pass ); } /* We only deal with GET and HEAD by default */ # remove cookies for comments cookie to make caching better. 
set req.http.cookie = regsub(req.http.cookie, "1231111111111111122222222333333=[^;]+(; )?", ""); # never cache the admin pages, or the server-status page, or your feed? you may want to..i don't if (req.request == "GET" && (req.url ~ "(wp-admin|bb-admin|server-status|feed)")) { return(pipe); } # don't cache authenticated sessions if (req.http.Cookie && req.http.Cookie ~ "(wordpress_|PHPSESSID)") { return(lookup); } # don't cache ajax requests if(req.http.X-Requested-With == "XMLHttpRequest" || req.url ~ "nocache" || req.url ~ "(control.php|wp-comments-post.php|wp-login.php|bb-login.php|bb-reset-password.php|register.php)") { return (pass); } return( lookup ); } Varnish Daemon options DAEMON_OPTS="-a :80 \ -T 127.0.0.1:6082 \ -f /etc/varnish/ituts.vcl \ -u varnish -g varnish \ -S /etc/varnish/secret \ -p thread_pool_add_delay=2 \ -p thread_pools=8 \ -p thread_pool_min=100 \ -p thread_pool_max=1000 \ -p session_linger=50 \ -p session_max=150000 \ -p sess_workspace=262144 \ -s malloc,5G" Im not sure where to start, should i for start optimize php-fpm and then go to varnish or php-fpm is at its max right now so i should start looking for the problem in varnish?

    Read the article

  • Can I use array_push on a SESSION array in php?

    - by zeckdude
    I have an array that I want on multiple pages, so I made it a SESSION array. I want to add a series of names and then, on another page, be able to use a foreach loop to echo out all the names in that array. This is the session variable: $_SESSION['names']. I want to add a series of names to that array using array_push like this: array_push($_SESSION['names'], $name); but I am getting this error: array_push() [function.array-push]: First argument should be an array. Can I use array_push to put multiple values into that array? Or perhaps there is a better, more efficient way of doing what I am trying to achieve?

    Read the article

  • Best way to Fingerprint and Verify html structure.

    - by Lukas Šalkauskas
    Hello there, I just want to know your opinion on how to fingerprint/verify HTML/link structure. The problem I want to solve: fingerprint, for example, 10 different sites' HTML pages, and after some time be able to verify them, so that if a site has changed or its links have changed, verification fails; otherwise verification succeeds. My basic idea is to analyze the link structure by splitting it in some way, building some kind of tree, and generating some kind of code from that tree. But I'm still in the brainstorming stage, where I need to discuss this with someone and hear other ideas. So any ideas, algorithms, and suggestions would be useful.
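
    As one illustration of the idea (in C#, since the question names no language): flatten the page's link structure into an ordered list and hash it, so any added, removed, or reordered link changes the fingerprint. The regex extraction is an assumption; a proper HTML parser would be more robust.

        using System;
        using System.Linq;
        using System.Security.Cryptography;
        using System.Text;
        using System.Text.RegularExpressions;

        static class LinkFingerprint
        {
            // Hash the ordered list of href values found in the HTML.
            public static string Compute(string html)
            {
                var links = Regex.Matches(html, @"href\s*=\s*[""']([^""']+)[""']",
                                          RegexOptions.IgnoreCase)
                                 .Cast<Match>()
                                 .Select(m => m.Groups[1].Value.Trim());

                string joined = string.Join("\n", links);
                using (var sha = SHA256.Create())
                {
                    byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(joined));
                    return BitConverter.ToString(hash).Replace("-", "");
                }
            }
        }
        // Verification later: recompute Compute(html) and compare fingerprints.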

    Read the article

  • Wordpress FORCE UPDATE of permalink settings

    - by Scott B
    I've been having issues when creating new WordPress blogs where I set permalinks via script on theme activation. Even though the settings appear to be correct when I check the permalink settings in WP, my new pages are throwing 404 errors. The only fix I've found is to go back to the permalink options and click "Save Changes", even though, according to the display, I've made no changes that need saving... I'm setting permalinks to /%postname%/. Here's how I'm doing it: if(get_option('permalink_structure')==""){update_option('permalink_structure', '/%postname%/');} That script gets run when my theme is activated. Any ideas why it only partially does the job?

    Read the article

  • Is it possible to do a partial postback on the web?

    - by carter-boater
    Hi all, I read some paragraphs in a book saying that it is not possible to do a partial postback on the web, even when AJAX is employed; AJAX will post back everything and update only the ajaxified controls. However, on pages I made using AJAX, I used Fiddler to monitor the traffic. I found that when the page initially loads, it loads everything, including pictures. But when I click a button and do an AJAX postback, I can see that only some data is loaded. It looks like it doesn't need to reload the whole page again. I don't know whether what I see is correct, or whether the book is correct. Thank you.

    Read the article

  • Add a different ID to each li element by jQuery

    - by Machi
    Hi guys, I'm new here and I'd like to ask a question about jQuery. I have an unordered list like: <ul id="pages"> <li class="something"><a href="#"></a></li> <li class="something"><a href="#"></a></li> <li class="something"><a href="#"></a></li> </ul> And I'd like to add a different ID to every li displayed in this <ul> (e.g. <li class="something" id="li1">...). Is there a way to achieve this via jQuery? Thanks a lot, Jakub

    Read the article

  • Django: tinyMCE and cross site javascript

    - by pistacchio
    Hi, following this question, I was able to set some textareas on my admin page as rich-text inputs. The top-voted answer suggests following an example, which is what I did. It also mentions a problem concerning "blank pages"; I'm having the same problem and I'm not able to solve it. I have my media files served by a different server, so MEDIA_URL points to a different host with a different port. To simulate this in my dev environment, I also serve media files from a different port. Debugging the failing JavaScript, Chrome yields: Unsafe JavaScript attempt to access frame with URL http://localhost:8000/admin/blog/post/add/ from frame with URL http://localhost:88/s3mangerie/js/tiny_mce/themes/advanced/image.htm. Domains, protocols and ports must match. How can I solve this problem? Thanks

    Read the article

  • Slow draw on some apps and dynamic clocks not working properly with ATI/AMD proprietary drivers

    - by Rakeka
    I've recently purchased a new computer (around July 2010) and I've been having some problems with proprietary video drivers on Linux. The hardware is: Video: ATI/AMD Radeon HD 5870 (XFX HD-587X-ZNFC); Motherboard: Asus P7P55D-E Deluxe; Processor: Intel i5 750; Memory: Kingston Hyperx KHX1600C8D3K2/4GX (2x - 8GB Total); Power Supply: XFX P1-750B-CAG9; There are no overclocks, not even the memories (they are at 1333mhz due processor memory controller limitation). The operational system is a homebrew Linux distribution with the following software: Architecture: x86_64 (multilib) Kernel: 2.6.35.10 Xorg: 7.5 Window Manager: wmii-3.9.2 Video Driver: ATI/AMD Catalyst 10.12 There are no desktop effects programs like compiz fusion or beryl. The problems: With ATI/AMD proprietary driver, some applications are with slow draw/redraw, and, the same applications make the driver to increase the card clocks to maximum (0% gpu activity, only the clocks are increased). I dunno exactly how to describe the slow draw but I'll list some applications and symptoms. xterm Flickers a lot when drawing continuous output; When I'm in a workspace with fullscreen xterm, The gpu load stays at 12% in idle, and, with smaller xterm, smaller GPU load. "aticonfig --odgc" output: Default Adapter - ATI Radeon HD 5800 Series Core (MHz) Memory (MHz) Current Clocks : 157 300 Current Peak : 850 1200 Configurable Peak Range : [600-900] [900-1300] GPU load : 12% "aticonfig --pplib-cmd 'get activity'" output: Current Activity is Core Clock: 157MHZ Memory Clock: 300MHZ VDDC: 950 Activity: 12 percent Performance Level: 0 Bus Speed: 5000 Bus Lanes: 16 Maximum Bus Lanes: 16 More examples: mplayer time info flickers on terminal; "find /" flickers a lot (It takes some time to stop with control-c. But, If I change the workspace or put some window upon it, just after the control-c, it stops instantly); "cat somefile" if the file is big (Xorg.0.log for example) it takes some time to display; vim and less (ex: find / | less) don't have much problems, just a little flicker when scrolling; mplayer (no gui) Slow reproduction and seek with -vo x11; Tearing with -vo xv; Time info flickers on terminal (xterm consequence); gvim A little slow draw when scrolling with page up/page down; Firefox Slow draw/redraw on some pages like www.boadica.com.br and sometimes on www.youtube.com with flash enable (never noticed on many pages); Corruptions when informative yellow boxes are showing and scroll the page (an gray box appears at the same place of the informative box); "Wallpaper" After minimizing a fullscreen window or changing to an empty workspace it takes some time to redraw wallpaper. "Video Card" The core and memory clocks are increased with the events described above and on other situations like change workspace (even without wallpaper), minimize, maximize or move a window; Idle clocks: Core: 157mhz, Memory: 300mhz Full clocks: Core: 850mhz, Memory: 1200mhz xpdf Painful slow scrolling; display (from ImageMagick) Slow menus and sometimes slow image redraw; Programs that I use and are apparently without problems: gimp; pidgin; mplayer (-vo gl, gl2); blender; unigine heaven (better fps than on Windows); doom3; tibia; penumbra overture; amnesia the dark descent (wine); diablo 2 (wine); No problems on Windows (Windows 7 Ultimate 64bit). 
And special note to this: Full desktop effects from Debian and Ubuntu gnome appearance cpanel don't cause ANY problems, even the core and memory clocks don't increase when change workspace, minimize, maximize or move a window. What I've tested: Unsuccessful tests: Tested all drivers versions since 10.6 (released approximately when I've installed the first slackware in this PC); Tested other video card - ATI/AMD Radeon HD 5570 (XFX HD-557X-ZHF2); Tested some options on xorg.conf and that I've found googling (some of these options are commented on my xorg.conf. I'll send the links at the end of post); Tested some patches like 107_fedora_dont_fill_bg_none.patch and xserver-xorg-backclear.patch from Arch Linux Catalyst page (https://wiki.archlinux.org/index.php/ATI_Catalyst); Tested other distros and software versions: Tested XORG-7.6 on my own distribution; Tested Debian Squeeze (testing - from 2010-12-20); Tested Ubuntu Marverick (10.10); Tested Slackware 13.1; Distros info: Architecture: i386 Debian and Ubuntu with all default software (kernel, gnome, xorg, drivers); Slackware with Catalyst from AMD page and default window managers like: fvwm, xfce, and my own build of wmii; Successful tests: Tested other video card (only on my homebrew distro) - NVIDIA Geforce 7300GS with driver 260.19.29; That didn't shown the slow draw problems, but that card is a bit obsolete, so, dunno if that lacks features like the dynamic clocks. I don't dispose of other video cards like nvidia g/gt/gts/gtx 200~400~500 or Radeon HD 3000/4000/6000 to make more tests. Tested other hardware: Video: ATI/AMD Radeon HD 5570 (XFX HD-557X-ZHF2); Motherboard: Intel DG31PR; Processor: Core 2 Duo E6750; Software for that hardware: Fresh install of same distros (except for the mine) with same program versions; That video card (HD 5570) were full time at the maximum clocks (something like 500/750, don't remember) in all the operational systems (Windows XP and Windows 7 too), but it didn't shown the same problems that I have here. I've googled a lot about common problems with ATI/AMD proprietary drivers for Linux and didn't find similar problems, except by the Firefox corruptions, that the solutions were to disable ATI Direct2DAccel and use XAA. With XAA the problems persists and the other applications like pidgin and rest of Firefox showed the same problems of slow draw/redraw. Open source Drivers: With open source drivers (xf86-video-ati-6.13.2) I hadn't the same slow draw problems, but, had other problems, that, for now, make it no viable solution. I'll not discuss it here because this is another line of problems and will confuse everything. If it happens to be the only solution, I'll make another thread to discuss it. Logs and Configs: kernel .config dmesg xorg package list xorg.conf Xorg.0.log

    Read the article

  • How can developers use a similar tracking link to Google's results page?

    - by Peter Jones
    I've read heaps of pages by people trying to implement some kind of tracking system similar to the way Google reroutes search links. E.g.: search "Facebook" in Google, open it in a new window, and the link changes to something like: "http://www.google.com.au/url?sa=t&source=web&cd=1&ved=0CBkQFjAA&url=http%3A%2F%2Fwww.facebook.com%2F&rct=j&q=Facebook&ei=sksZTZexJobJccXnxZYK&usg=AFQjCNHTTNi-O4Qgrg6kvGVfKJuRqbuOKw&cad=rja" I'm guessing Google tracks that click and then redirects to the actual site by reading the url parameter. What I wanted to know is whether there is a simple way to make this kind of functionality work using an onclick event - just change the link href after it is clicked so it redirects? There are a few threads, but from what I could find, nobody has actually succeeded without problems or limitations. Thanks in advance.

    Read the article

  • How much should I charge for Rails programming?

    - by Oskar Gantt
    I have been asked to quote an hourly rate for freelance programming for a Rails project. Although it would be my first paid project on Rails, I know the technology well from personal projects and have a decade of professional programming experience. This would be my first freelance project ever, so I have no idea how to find out what the going rate for my services should be. Obviously, if I quote a rate that is too high, they may choose someone else - too low and I may feel cheated later on. Any suggestions? Update: I am in NYC and the project is scheduled for 6 months to a year (but this seems unrealistic - I think it will be a multi-year project). I would develop on site (at a corporate location) with one other developer and the project would consist of about 200 custom-built pages initially. 10 hour days with weekends and additional overtime as required. The customer has given no information about how much they will pay - "a competitive rate" - they want me to start the discussion.

    Read the article

  • ASP.NET Dynamic Data Site with custom MetaData

    - by loviji
    Hello, I'm looking for info about configuring my own metadata in an ASP.NET Dynamic Data site. For example, I have a table in MS SQL Server with the structure shown below: CREATE TABLE [dbo].[someTable]( [id] [int] NOT NULL, [pname] [nvarchar](20) NULL, [FullName] [nvarchar](50) NULL, [age] [int] NULL) and there are two MS SQL tables (that I've created), sysTables and sysColumns. sysTables: ID | sysTableName | TableName | TableDescription 1 | someTable | Persons | All data about Persons in the system sysColumns: ID | TableName | sysColumnName | ColumnName | ColumnDesc | ColumnType | MUnit 1 | someTable | sometable_pname | Name | Person name (e.g. John) | nvarchar(20) | null 2 | someTable | sometable_Fullname | Full Name | Person full name (e.g. John Black) | nvarchar(50) | null 3 | someTable | sometable_age | age | Person age | int | null I want the Details/Edit/Insert/List/ListDetails pages to use sysColumns and sysTables as their metadata, because, for example, on the Details page "FullName" is not as nice as "Full Name". Any ideas? Is it possible? Thanks

    Read the article
