Posts by: [email protected]

Posts: 54
Post: 5940
Topic: index.php/htaccess/permalinks

Yes, I will check that, Josh, as far as I can. In any event, a few paths got sorted out and everything is working now, including a non-www to www 301 redirect, which you could reverse if that's the preferred option.

I can't guarantee which versions/servers the file will work on, or whether every part of it is essential. I worked from the suggested file as a base and didn't remove anything that was doing no harm. Seemed worth posting in case it's of use to anyone.

<IfModule mod_rewrite.c>
RewriteEngine On

# 301 redirect non-www to www
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]

<IfModule mod_env.c>
SetEnv gp_rewrite 5rWRYcc
</IfModule>

RewriteBase "/"
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
<IfModule mod_cache.c>
RewriteRule /?(.*) "/index.php?$1" [qsa,L]
</IfModule>
<IfModule !mod_cache.c>
RewriteRule . "/index.php" [L]
</IfModule>
</IfModule>
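
If www to non-www is the preferred direction instead, reversing it should just be a case of swapping the host test and the target in those two redirect lines, along these lines (untested here, and the domain is only a placeholder):

RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]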

7 years ago
Post: 5937
Topic: index.php/htaccess/permalinks

For anyone else in the same position: I tried a variety of versions without much difference, and went through the mods one by one with full testing, so they are not the issue. Josh's htaccess file worked fine, but much else broke once the config file was amended. Page editing and most functions were gone, and the same applied to every other option I tried with gp_indexphp set to false, except for the following (which I imagine was also produced by Josh):

# BEGIN gpEasy
<IfModule mod_rewrite.c>
<IfModule mod_env.c>
SetEnv gp_rewrite 5rWRYcc
</IfModule>
RewriteEngine On
RewriteBase "/"
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
<IfModule mod_cache.c>
RewriteRule /?(.*) "/index.php?$1" [qsa,L]
</IfModule>
<IfModule !mod_cache.c>
RewriteRule . "/index.php" [L]
</IfModule>
</IfModule>
# END gpEasy

Still a bit to do, such as getting www/non-www sorted, and no content or new pages have been added yet, but in principle the file above and the change in the config file seem to work together after a careful, fresh install. I thought something similar earlier today and then found a few problems, so at least I hope they do. Some sleep now, while I can still tell myself that :)

7 years ago
Post: 5936
Topic: index.php/htaccess/permalinks

Hi Josh

Unfortunately a fair amount is still broken; I also tried your htaccess and a combination of the two. I've been at this too long now and am getting confused, so I'll take a break and try another completely fresh start.

A bit concerned about whether this will affect others as the update gathers pace. We have a good number of broken sites, and it appears the issue is not confined to gpEasy. As you saw above, others are having similar problems.

7 years ago
Post: 5935
Topic: index.php/htaccess/permalinks

Hi Josh

Having read through the version notes (about time), I noticed the reference to mod_rewrite using ? with qsa for 3.0.5. I've just tried installing this, setting gp_indexphp to false and using the suggested htaccess entry from the permalinks box:

# BEGIN gpEasy
<IfModule mod_rewrite.c>
<IfModule mod_env.c>
SetEnv gp_rewrite 5rWRYcc
</IfModule>
RewriteEngine On
RewriteBase "/"
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
<IfModule mod_cache.c>
RewriteRule /?(.*) "/index.php?$1" [qsa,L]
</IfModule>
<IfModule !mod_cache.c>
RewriteRule . "/index.php" [L]
</IfModule>
</IfModule>
# END gpEasy

This seems to be working okay, but I have not checked through much yet. I have had everything apparently okay a few times before and always found something broken.

I'm guessing the CSS changes etc. for 3.0.5 won't be so different from 3.0.2, nor any odd config changes. I will also try your suggested htaccess entry and see how best to combine it with a www rule.
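
For what it's worth, combining them should just be a case of putting the redirect ahead of the index.php rules, roughly along these lines (untested on my side, domain only a placeholder):

RewriteEngine On

# 301 redirect non-www to www before anything is rewritten to index.php
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]

# ...then the RewriteBase and index.php rules from the block above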

Beyond that, I'll just make changes very gradually and test functions at each stage to see if anything fails. I did get a fair way down the road a couple of times, but either functions failed or the page manager half vanished.

The current attempt looks promising, will see. Thanks very much for taking the time to help.


7 years ago
Post: 5932
Topic: index.php/htaccess/permalinks

Just to update: the issue is caused by a host update to Apache 2.4.4. There's a somewhat related article the host referenced here:

http://ellislab.com/expressionengine/user-guide/cp/admin/output_and_debugging_preferences.html

In essence:

RewriteRule ^index\.php$ - [L]

needs to become:

RewriteRule ^index\.php?$ - [L]

You still also need to set gp_indexphp to false in the gpconfig.php file. This fixes the index.php problem but also breaks just about every other function. An attempt at editing just produces:

"There was an error processing the last request. Please reload this page to continue."

Other core functions also broken.
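
For reference, the config change is the gp_indexphp define in gpconfig.php, which once set should look roughly like the line below. Please check against your own gpconfig.php rather than taking this from memory as gospel:

define('gp_indexphp', false);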

I imagine this is an update which will gradually roll out across different hosts, so it looks like a wider problem for various CMSs, including gpEasy. Any suggestions welcome.

7 years ago
Post: 5929
Topic: index.php/htaccess/permalinks

Hi

Hope someone has a suggestion. We have a problem redirecting index.php site-wide. Due to mods, we are using gpEasy 3.0.2.

The permalink setting is not working, though it used to, at least to a degree. The type of file we have used in the past is, however, no longer working:

<IfModule mod_rewrite.c>
    RewriteEngine On

    RewriteCond %{http_host} ^domain.com [NC]
    RewriteRule ^(.*)$ http://www.domain.com/$1 [L,R=301,NC]

    <IfModule mod_env.c>
        SetEnv gp_rewrite gJjsF6q
    </IfModule>

    RewriteBase "/"
    RewriteRule ^index\.php$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    <IfModule mod_cache.c>
        RewriteRule /?(.*) "/index.php/$1" [L]
    </IfModule>
    <IfModule !mod_cache.c>
        RewriteRule . "/index.php" [L]
    </IfModule>
</IfModule>

Also tried the suggested file option to hide index.php that comes with the CMS:

<IfModule mod_rewrite.c>
<IfModule mod_env.c>
SetEnv gp_rewrite xu8Ji9x
</IfModule>
RewriteEngine On
RewriteBase "/"
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
<IfModule mod_cache.c>
RewriteRule /?(.*) "/index.php/$1" [L]
</IfModule>
<IfModule !mod_cache.c>
RewriteRule . "/index.php" [L]
</IfModule>
</IfModule>

Neither will do the job any longer. I also tried setting gp_indexphp to false in your gpconfig.php file. Only the home/root page will function; the rest simply give "No input file specified." Or, with the htaccess empty to avoid any possible conflict, "404 Not Found."

I did see "A false setting without the necessary mod_rewrite settings will break site navigation" in the docs and this is the reality.

There seems to be a possibility that the host carried out an update which might have had an effect, but they are suggesting there is nothing abnormal.

Wondered if anyone has seen similar issues recently, or has an idea to help.

Thanks

7 years ago
Post: 5425
Topic: UK Cookie law

"Does the UK have its own cookie law or is it the same as the EU Cookie Law?"

Much EU "law" consists of directives, which are then enacted by the individual countries, often with a degree of flexibility. Even where they are more than directives, the same tends to apply. There are exceptions but most EU law would be better seen as EU principles of law. So UK cookie legislation is not separate as such but implementation may be slightly different than in other countries.

In the UK, the Information Commissioner's Office (ICO) is responsible for applying the law. There's a little more in the thread Josh mentioned.

As far as non-tracking cookies essential to site use go, I wouldn't worry too much.

7 years ago
Post: 5401
Topic: EU Cookie Law

That is a helpful page, and the link to the ICO cookie information is worth following. There is, though, a degree of difference between reality and theory, summed up in:

"This means in theory websites need to tell people about analytical cookies and gain their consent.

In practice we would expect you to provide clear information to users about analytical cookies and take what steps you can to seek their agreement. This is likely to involve making the argument to show users why these cookies are useful. Although the Information Commissioner cannot completely exclude the possibility of formal action in any area, it is highly unlikely that priority for any formal action would be given to focusing on uses of cookies where there is a low level of intrusiveness and risk of harm to individuals. Provided clear information is given about their activities we are highly unlikely to prioritise first party cookies used only for analytical purposes in any consideration of regulatory action."

The letter of the law suggests all sites using cookie-based analytics (which is most) should get agreement from users. What the paragraph above does is acknowledge this but, in a quiet way, say that they are not going to enforce it, as is the case in a few similar situations. Two reasons:

They would need to get into a legal argument with 15 million websites, not very practical.

The damage to the UK economy would be unacceptable. If you trawl through the ICO site, you can find minutes of their management meetings. At one, they decided to implement the literal requirement in a bold way on their own site, more or less stating that they did not expect much effect. By the next meeting, they had discovered their site use had plummeted, and that is on a trusted site.

They then stepped back somewhat from overkill, as did many other sites which had initially followed the letter of the law more closely, including major websites. Plenty of people reported a 100% or greater increase in bounce rate due to a perfectly compliant popup or equivalent. Their business was being destroyed.

A common approach is to:

1. Add a prominent link on every page, maybe in the footer, to "Cookies", "Cookie Information" or whatever.

2. Write a page explaining exactly what cookies the site is using and why, preferably with further information on what cookies are, links on how users can manage them in different browsers, etc.

If you are only using basic first-party cookies, you may theoretically not be in line with the law, but you can be seen to be making an effort and, in reality, should be okay. The other option is to risk damage to your site when you may not need to.

Even apart from the problems regarding location in other countries, this law was not well thought out and its implications were not grasped. What needed dealing with were aspects such as third-party advertiser cookies or other personal tracking implementations. If you are using those, you should follow the law, or you could one day regret not doing so.

7 years ago
Post: 5367
Topic: regarding robots.txt

Josh is right, many sites don't need sitemaps. Google, and perhaps Bing, are capable of finding most of a site by following internal links; even if URLs are not crawled on the visit where they are found, they will be parsed and go into a crawl queue, much as they would from a sitemap.

An XML sitemap is more likely to be of use for sites with complex architecture or URLs which are hard to reach internally, for very large sites, or for sites that are desperate to have pages indexed a little quicker (no guarantee). It can also be of some use for a site with canonicalisation issues, to reinforce preferred URLs, or where there have suddenly been a lot of URL changes.

As an incidental point, there is no reason to have URLs you are blocking with robots.txt in a sitemap, and there are reasons not to. robots.txt does not stop URLs being indexed, it just stops the content being crawled.
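
As a side note, if the aim is actually to keep something out of the index, rather than just uncrawled, a noindex robots header is a more reliable tool than robots.txt. A rough sketch only, using a per-directory htaccess, an illustrative directory and assuming mod_headers is available:

# placed in the .htaccess of the directory you want kept out of the index
<IfModule mod_headers.c>
Header set X-Robots-Tag "noindex"
</IfModule>

Bear in mind the crawler has to be able to fetch a URL to see that header, so don't also block the same section in robots.txt.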

Googlebot in particular is becoming more invasive, so if anyone wants to block sections they don't want indexed, that's not a terrible idea, but you need to be careful. If, for example, you block /data/ and are using the standard upload facility, your images will not be indexed. If the images are all you are concerned about and you want to block everything else conceivable, you could in theory try:

User-agent: *
Allow: /data/_uploaded/image/
Disallow: /data/
Disallow: /addons/
Disallow: /include/
Disallow: /themes/

Sitemap: http://www.yourdomain.com/sitemap.xml

Please bear in mind that is just a broad example and as Josh mentioned, each site should make their own decisions. There will be plenty of situations where that file would be wrong.

You should also, in principle, allow crawling of CSS and JavaScript files, partly because Google have requested this, but also because there may be URLs in those files you want robots to follow. So, depending on setup, you would need to add other exceptions at the top, such as:

Allow: /data/_themes/primary/main_pages/style.css (or whatever applies)
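
Pulled together with the earlier example, that could end up as something like the following, still only illustrative and with paths that depend entirely on the theme and setup:

User-agent: *
Allow: /data/_uploaded/image/
Allow: /data/_themes/primary/main_pages/style.css
Disallow: /data/
Disallow: /addons/
Disallow: /include/
Disallow: /themes/

Sitemap: http://www.yourdomain.com/sitemap.xml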

It would be possible to end up with quite a few exceptions, and it's all too easy to make errors. If you have access, the best thing is to think about structure before putting up the site and make life easier for yourself.

If you are not sure on any point, don't worry about it. There are many more important aspects to spend time on for good indexing/search results, not least what is on the site.

7 years ago
Post: 5359
Topic: about hiding "Powered by gpEasy CMS message" in fo

"b) hide to site visitors (in which case the powered by link is not removed from html but is not shown to site visitors or logged in users as css style (display:none) is applied to that anchor html tag)"

That's a pretty good way to get the gpEasy website penalised, so probably not what they want.

7 years ago
