I am using Next.js and have built a website with a focus on server-side rendering. In my components I even added `'use server'` to make sure pages and components are rendered on the server. My main goal is SEO. Web crawlers like Screaming Frog scan a page, follow its links, and report what hurts SEO so you can fix mistakes and improve. However, when I run the crawler, only my main page shows up; no other links are found. Why, and how can I fix it? For example, the `/test` page below is not found. Here is the source code:
layout.js

```jsx
<Menu></Menu>
```
menu.jsx

```jsx
'use server';

import React from 'react';
import styles from './menu.module.css';
import Link from 'next/link';
import menuItemsService from '@/utils/menuItems';

export const Menu = () => {
  const items = menuItemsService.getUserMenuItems();

  return (
    <div className={styles.menu}>
      {items.map((item, index) => (
        <Link key={index} className={styles.menuItem} href={item.link}>
          {item.label}
        </Link>
      ))}
    </div>
  );
};
```
menuitems.js

```js
const menuItemsService = {
  getMenuItems() {
    return [
      {
        label: "Test",
        link: "/test",
        onlyAdmin: false,
        footer: false
      }
    ];
  }
};

export default menuItemsService;
```
page.jsx

```jsx
'use server';

import React from 'react';

export const Test = () => {
  return (
    <div>Test</div>
  );
};

export default Test;
```
Screaming Frog results: only the main page is listed, no other URLs.
I tried adding `'use server'` to ensure server-side rendering.
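To double-check what a crawler actually receives, I wrote a small helper (my own debugging sketch, not part of the app) that pulls every `<a href="...">` out of raw HTML, which is roughly how a crawler discovers links. I run it against the output of `curl -s http://localhost:3000`; the inline example below just shows what the markup should look like when the menu is server-rendered:

```javascript
// Extract every <a href="..."> target from raw HTML, the same way a
// crawler discovers links without executing any JavaScript.
function extractLinks(html) {
  const links = [];
  const re = /<a\b[^>]*\bhref="([^"]*)"/gi;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

// Example: HTML as it should appear in the initial server response
// when the menu is rendered on the server.
const rendered =
  '<div class="menu"><a class="menuItem" href="/test">Test</a></div>';
console.log(extractLinks(rendered)); // [ '/test' ]
```

If `extractLinks` returns an empty array for the real server response, the links are only being added client-side, which would explain why Screaming Frog sees nothing beyond the main page.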